Cellular gene profiling may predict IBD treatment response

Transcriptomic profiling of phagocytes in the lamina propria of patients with inflammatory bowel disease (IBD) may guide future treatment selection, according to investigators.

Mucosal gut biopsies revealed that phagocytic gene expression correlated with inflammatory states, types of IBD, and responses to therapy, lead author Gillian E. Jacobsen, an MD/PhD candidate at the University of Miami, and colleagues reported.

In an article in Gastro Hep Advances, the investigators wrote that “lamina propria phagocytes along with epithelial cells represent a first line of defense and play a balancing act between tolerance toward commensal microbes and generation of immune responses toward pathogenic microorganisms. ... Inappropriate responses by lamina propria phagocytes have been linked to IBD.”

To better understand these responses, the researchers collected 111 gut mucosal biopsies from 54 patients with IBD, among whom 59% were taking biologics, 72% had inflammation in at least one biopsy site, and 41% had previously used at least one other biologic. Samples were analyzed to determine cell phenotypes, gene expression, and cytokine responses to in vitro Janus kinase (JAK) inhibitor exposure.

Ms. Jacobsen and colleagues noted that most reports that address the function of phagocytes focus on circulating dendritic cells, monocytes, or monocyte-derived macrophages, rather than on resident phagocyte populations located in the lamina propria. However, these circulating cells “do not reflect intestinal inflammation, or whole tissue biopsies.”

The investigators identified phagocytes based on CD11b expression and phenotyped the CD11b+-enriched cells using flow cytometry. In samples with active inflammation, cells were most often granulocytes (45.5%), followed by macrophages (22.6%) and monocytes (9.4%). Uninflamed samples had a lower proportion of granulocytes (33.6%), a similar proportion of macrophages (22.7%), and a higher proportion of B cells (15.6% vs. 9.0%).

Ms. Jacobsen and colleagues highlighted the increase in granulocytes, including neutrophils, in inflamed samples.

“Neutrophilic infiltration is a major indicator of IBD activity and may be critically linked to ongoing inflammation,” they wrote. “These data demonstrate that CD11b+ enrichment reflects the inflammatory state of the biopsies.”

The investigators also showed that transcriptional profiles of lamina propria CD11b+ cells differed “greatly” between colon and ileum, which suggested that “the location or cellular environment plays a marked role in determining the gene expression of phagocytes.”

CD11b+ cell gene expression profiles also correlated with ulcerative colitis versus Crohn’s disease, although the researchers noted that these patterns were less pronounced than correlations with inflammatory states.

“There are pathways common to inflammation regardless of the IBD type that could be used as markers of inflammation or targets for therapy,” they wrote.

Comparing colon samples from patients who responded to anti–tumor necrosis factor therapy with those who were refractory to anti-TNF therapy revealed significant associations between response type and 52 differentially expressed genes.

“These genes were mostly immunoglobulin genes up-regulated in the anti–TNF-treated inflamed colon, suggesting that CD11b+ B cells may play a role in medication refractoriness,” the investigators wrote.

Evaluating inflamed colon and anti-TNF refractory ileum revealed differential expression of OSM, a known marker of TNF-resistant disease, as well as TREM1, a proinflammatory marker. In contrast, NTS genes showed high expression in uninflamed samples on anti-TNF therapy. The researchers noted that these findings “may be used to build precision medicine approaches in IBD.”

Further experiments showed that in vitro exposure of anti-TNF refractory samples to JAK inhibitors resulted in significantly reduced secretion of interleukin-8 and TNF-alpha.

“Our study provides functional data that JAK inhibition with tofacitinib (JAK1/JAK3) or ruxolitinib (JAK1/JAK2) inhibits lipopolysaccharide-induced cytokine production even in TNF-refractory samples,” the researchers wrote. “These data inform the response of patients to JAK inhibitors, including those refractory to other treatments.”

The study was supported by Pfizer, the National Institute of Diabetes and Digestive and Kidney Diseases, the Micky & Madeleine Arison Family Foundation Crohn’s & Colitis Discovery Laboratory, and the Martin Kalser Chair in Gastroenterology at the University of Miami. The investigators disclosed additional relationships with Takeda, AbbVie, Eli Lilly, and others.

These insights are an important start

Inflammatory bowel diseases are complex and heterogeneous disorders driven by inappropriate immune responses to luminal substances, including diet and microbes, resulting in chronic inflammation of the gastrointestinal tract. Therapies for IBD largely center on suppressing immune responses; however, given the complexity and heterogeneity of these diseases, there is no consensus on which aspect of the immune response to suppress or which cell type to target in a given patient.

In this study, Jacobsen et al. profiled CD11b+ lamina propria phagocytes from biopsy specimens of patients with IBD and identified genes differentially expressed depending on inflammation status (uninflamed vs. inflamed), tissue type (colon vs. ileum), and type of IBD (ulcerative colitis vs. Crohn’s disease). The study is notable in that it examined CD11b+ cells from the gut, as opposed to the circulating cell populations examined in many other studies, and evaluated the response of these resident populations to emerging therapies for IBD. The authors find that, even in patients refractory to anti-TNF-alpha therapy, the most commonly used class of biologic for IBD, CD11b+ cell populations can be modulated and inflammatory responses suppressed by Janus kinase inhibitors in vitro, which suggests that this may be a therapeutic approach for this difficult-to-treat patient population. Beyond these observations, the study could also foreshadow future approaches that use intestinal biopsies to tailor immunotherapy to individual patients with IBD, particularly in difficult-to-treat refractory cases.

Sreeram Udayan, PhD, and Rodney D. Newberry, MD, are with the division of gastroenterology in the department of medicine at Washington University, St. Louis.

FROM GASTRO HEP ADVANCES

Deep learning system outmatches pathologists in diagnosing liver lesions

A new deep learning system can classify hepatocellular nodular lesions (HNLs) via whole-slide images, improving risk stratification of patients and diagnostic rate of hepatocellular carcinoma (HCC), according to investigators.

While the model requires further validation, it could eventually be used to optimize accuracy and efficiency of histologic diagnoses, potentially decreasing reliance on pathologists, particularly in areas with limited access to subspecialists.

In an article published in Gastroenterology, Na Cheng, MD, of Sun Yat-sen University, Guangzhou, China, and colleagues wrote that the “diagnostic process [for HNLs] is laborious, time-consuming, and subject to the experience of the pathologists, often with significant interobserver and intraobserver variability. ... Therefore, [an] automated analysis system is highly demanded in the pathology field, which could considerably ease the workload, speed up the diagnosis, and facilitate the in-time treatment.”

To this end, Dr. Cheng and colleagues developed the hepatocellular-nodular artificial intelligence model (HnAIM), which scans whole-slide images to classify seven tissue types, including well-differentiated HCC, high-grade dysplastic nodules, low-grade dysplastic nodules, hepatocellular adenoma, focal nodular hyperplasia, and background tissue.

Developing and testing HnAIM was a multistep process that began with three subspecialist pathologists, who independently reviewed and classified liver slides from surgical resections. Unanimous agreement was achieved for 649 slides from 462 patients. These slides were then scanned to create whole-slide images, which were divided into sets for training (70%), validation (15%), and internal testing (15%). Accuracy, measured by area under the curve (AUC), was over 99.9% for the internal testing set. The accuracy of HnAIM was then validated externally on independent cases.
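For readers less familiar with this kind of evaluation, the sketch below shows roughly how a 70%/15%/15% split and an AUC calculation fit together. It is an illustrative Python sketch only, using simulated features, placeholder binary labels, and a stand-in classifier; it is not the authors’ model, data, or code.

```python
# Illustrative sketch only -- not the authors' pipeline. Assumes each whole-slide
# image has already been reduced to a feature vector (X) with a consensus label (y).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(649, 128))    # placeholder features, one row per slide
y = rng.integers(0, 2, size=649)   # placeholder binary labels (the real task has 7 classes)

# 70% training, 15% validation, 15% internal testing, as described in the article
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.30, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.50, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # stand-in classifier
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])   # AUC on the held-out test set
print(f"internal test AUC: {auc:.3f}")
```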

First, HnAIM evaluated liver biopsy slides from 30 patients at one center. Results were compared with diagnoses made by nine pathologists classified as senior, intermediate, or junior. While HnAIM correctly diagnosed 100% of the cases, senior pathologists correctly diagnosed 94.4% of the cases, followed in accuracy by intermediate (86.7%) and junior (73.3%) pathologists.

The researchers noted that the “rate of agreement with subspecialists was higher for HnAIM than for all 9 pathologists at distinguishing 7 liver tissues, with important diagnostic implications for fragmentary or scarce biopsy specimens.”

Next, HnAIM evaluated 234 samples from three hospitals. Accuracy was slightly lower, with an AUC of 93.5%. The researchers highlighted how HnAIM consistently differentiated precancerous lesions and well-defined HCC from benign lesions and background tissues.

A final experiment showed how HnAIM handled the most challenging cases. The investigators selected 12 cases without definitive diagnoses and found that, like the three subspecialist pathologists, HnAIM did not reach a single diagnostic conclusion.

The researchers reported that “This may be due to a number of potential reasons, such as inherent uncertainty in the 2-dimensional interpretation of a 3-dimensional specimen, the limited number of tissue samples, and cognitive factors such as anchoring.”

However, HnAIM contributed to the diagnostic process by generating multiple diagnostic possibilities with weighted likelihoods. After reviewing these results, the expert pathologists reached consensus in 5 out of 12 cases. Moreover, two out of three expert pathologists agreed on all 12 cases, improving the agreement rate from 25% to 100%.
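A ranked list of class probabilities is one way to picture this kind of weighted-likelihood output. The short sketch below is hypothetical: the class names follow the tissue types named in the article, but the scores are invented for illustration and do not come from HnAIM.

```python
# Hypothetical sketch of presenting "multiple diagnostic possibilities with
# weighted likelihood" for one difficult slide; scores are invented, not HnAIM output.
import numpy as np

classes = ["well-differentiated HCC", "high-grade dysplastic nodule",
           "low-grade dysplastic nodule", "hepatocellular adenoma",
           "focal nodular hyperplasia", "background tissue"]
logits = np.array([2.1, 1.9, 0.4, -0.3, -1.0, -2.2])  # raw classifier outputs (invented)

probs = np.exp(logits - logits.max())
probs /= probs.sum()                                   # softmax -> likelihoods summing to 1

for name, p in sorted(zip(classes, probs), key=lambda t: -t[1]):
    print(f"{p:5.1%}  {name}")                         # ranked list a pathologist could review
```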

The researchers concluded that the model holds promise for facilitating human HNL diagnosis and improving efficiency and quality. It could also reduce the workload of pathologists, especially where subspecialists are unavailable.

The study was supported by the National Natural Science Foundation of China, the Guangdong Basic and Applied Basic Research Foundation, the Natural Science Foundation of Guangdong Province, and others. The investigators reported no conflicts of interest.

Work smarter not harder

As the prevalence of hepatocellular carcinoma (HCC) continues to rise, the early and accurate detection and diagnosis of HCC remains paramount to improving patient outcomes. In cases of typical or advanced HCC, an accurate diagnosis is made using CT or MR imaging. However, hepatocellular nodular lesions (HNLs) with atypical or inconclusive radiographic appearances are often biopsied to achieve a histopathologic diagnosis. In addition, accurate diagnosis of an HNL following liver resection or transplantation is important to long-term surveillance and management. An accurate histopathologic diagnosis relies on the availability of experienced subspecialty pathologists and remains a costly and labor-intensive process that can lead to delays in diagnosis and care.

In this study, Cheng et al. developed a deep learning system to differentiate histopathologic diagnoses of various HNLs, normal liver, and cirrhosis. Their model, the hepatocellular-nodular artificial intelligence model (HnAIM), accurately classified various liver histology slides, with an AUC of 93.5% in an external validation cohort. When compared with even the most experienced subspecialty pathologists, HnAIM demonstrated superior HNL histopathologic diagnostic accuracy. Using HnAIM either to make or to aid in the diagnosis of HNLs can lead to more accurate diagnoses in a more efficient and timely manner and has the potential to provide subspecialty care in areas that lack subspecialty pathologists. If this model is further validated, HnAIM may be used to improve the quality of care we are able to provide to our patients, ultimately improving our diagnosis of HCC, preventing delays in treatment, and improving patient outcomes.

Hannah P. Kim, MD, MSCR, is an assistant professor in the division of gastroenterology, hepatology, and nutrition in the department of medicine at Vanderbilt University Medical Center, Nashville, Tenn. She has no conflicts of interest.

FROM GASTROENTEROLOGY

New guideline sheds light on diagnosis, treatment of rare GI syndromes

A clinical practice guideline for the diagnosis and management of gastrointestinal hamartomatous polyposis syndromes has just been published by the U.S. Multi-Society Task Force on Colorectal Cancer, which comprises experts representing the American College of Gastroenterology, the American Gastroenterological Association, and the American Society for Gastrointestinal Endoscopy.

Gastrointestinal hamartomatous polyposis syndromes are rare, autosomal dominant disorders associated with intestinal and extraintestinal tumors. Expert consensus statements have previously offered some recommendations for managing these syndromes, but clinical data are scarce, so the present review “is intended to establish a starting point for future research,” lead author C. Richard Boland, MD, of the University of California, San Diego, and colleagues reported.

According to the investigators, “there are essentially no long-term prospective controlled studies of comparative effectiveness of management strategies for these syndromes.” As a result, their recommendations are based on “low-quality” evidence according to GRADE criteria.

Still, Dr. Boland and colleagues highlighted that “there has been tremendous progress in recent years, both in understanding the underlying genetics that underpin these disorders and in elucidating the biology of associated premalignant and malignant conditions.”

The guideline was published online in Gastroenterology.

Four syndromes reviewed

The investigators gathered these data to provide an overview of genetic and clinical features for each syndrome, as well as management strategies. Four disorders are included: juvenile polyposis syndrome; Peutz-Jeghers syndrome; hereditary mixed polyposis syndrome; and PTEN-hamartoma tumor syndrome, encompassing Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome.

Although all gastrointestinal hamartomatous polyposis syndromes are caused by germline alterations, Dr. Boland and colleagues pointed out that diagnoses are typically made based on clinical criteria, with germline results serving as confirmatory evidence.

The guideline recommends that any patient with a family history of hamartomatous polyps, or with a history of at least two hamartomatous polyps, should undergo genetic testing. The guideline also provides more nuanced genetic testing algorithms for each syndrome.

Among all the hamartomatous polyp disorders, Peutz-Jeghers syndrome is the best understood, according to the investigators. It is caused by aberrations in the STK11 gene and is characterized by polyps with “branching bands of smooth muscle covered by hyperplastic glandular mucosa” that may occur in the stomach, small intestine, and colon. Patients are also at risk of extraintestinal neoplasia.

For management of Peutz-Jeghers syndrome, the guideline advises frequent endoscopic surveillance to prevent mechanical obstruction and bleeding, as well as multidisciplinary surveillance of the breasts, pancreas, ovaries, testes, and lungs.

Juvenile polyposis syndrome is most often characterized by solitary, sporadic polyps in the colorectum (98% of patients affected), followed distantly by polyps in the stomach (14%), ileum (7%), jejunum (7%), and duodenum (7%). The condition is linked with abnormalities in BMPR1A or SMAD4 genes, with SMAD4 germline abnormalities more often leading to “massive” gastric polyps, gastrointestinal bleeding, protein-losing enteropathy, and a higher incidence of gastric cancer in adulthood. Most patients with SMAD4 mutations also have hereditary hemorrhagic telangiectasia, characterized by gastrointestinal bleeding from mucocutaneous telangiectasias, arteriovenous malformations, and epistaxis.

Management of juvenile polyposis syndrome depends on frequent colonoscopies with polypectomies beginning at age 12-15 years.

“The goal of surveillance in juvenile polyposis syndrome is to mitigate symptoms related to the disorder and decrease the risk of complications from the manifestations, including cancer,” Dr. Boland and colleagues wrote.

PTEN-hamartoma tumor syndrome, which includes both Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome, is caused by abnormalities in the eponymous PTEN gene. Patients with the condition have an increased risk of colon cancer and polyposis, as well as extraintestinal cancers.

Diagnosis of PTEN-hamartoma tumor syndrome may be complex, involving “clinical examination, mammography and breast MRI, thyroid ultrasound, transvaginal ultrasound, upper gastrointestinal endoscopy, colonoscopy, and renal ultrasound,” according to the guideline.

After diagnosis, frequent colonoscopies are recommended, typically starting at age 35 years, as well as continued surveillance of other organs.

Hereditary mixed polyposis syndrome, which involves attenuated colonic polyposis, is the rarest of the four disorders, having been reported in only “a few families,” according to the guideline. The condition has been linked with “large duplications of the promoter region or entire GREM1 gene.”

Onset is typically in the late 20s, “which is when colonoscopic surveillance should begin,” the investigators wrote. More data are needed to determine appropriate surveillance intervals and whether the condition is associated with an increased risk of extraintestinal neoplasia.

This call for more research into gastrointestinal hamartomatous polyposis syndromes carried through to the conclusion of the guideline.

“Long-term prospective studies of mutation carriers are still needed to further clarify the risk of cancer and the role of surveillance in these syndromes,” Dr. Boland and colleagues wrote. “With increases in genetic testing and evaluation, future studies will be conducted with more robust cohorts of genetically characterized, less heterogeneous populations. However, there is also a need to study patients and families with unusual phenotypes where no genotype can be found.”

The investigators disclosed no conflicts of interest with the current guideline; however, they provided a list of industry relationships, including Salix Pharmaceuticals, Ferring Pharmaceuticals, and Pfizer, among others.

Publications
Topics
Sections

A clinical practice guideline for the diagnosis and management of gastrointestinal hamartomatous polyposis syndromes has just been published by the U.S. Multi-Society Task Force on Colorectal Cancer, which is comprised of experts representing the American College of Gastroenterology, the American Gastroenterological Association, and the American Society for Gastrointestinal Endoscopy.

Gastrointestinal hamartomatous polyposis syndromes are rare, autosomal dominant disorders associated with intestinal and extraintestinal tumors. Expert consensus statements have previously offered some recommendations for managing these syndromes, but clinical data are scarce, so the present review “is intended to establish a starting point for future research,” lead author C. Richard Boland, MD, of the University of California, San Diego, and colleagues reported.

According to the investigators, “there are essentially no long-term prospective controlled studies of comparative effectiveness of management strategies for these syndromes.” As a result, their recommendations are based on “low-quality” evidence according to GRADE criteria.

Still, Dr. Boland and colleagues highlighted that “there has been tremendous progress in recent years, both in understanding the underlying genetics that underpin these disorders and in elucidating the biology of associated premalignant and malignant conditions.”

The guideline was published online in Gastroenterology .
 

Four syndromes reviewed

The investigators gathered these data to provide an overview of genetic and clinical features for each syndrome, as well as management strategies. Four disorders are included: juvenile polyposis syndrome; Peutz-Jeghers syndrome; hereditary mixed polyposis syndrome; and PTEN-hamartoma tumor syndrome, encompassing Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome.

Although all gastrointestinal hamartomatous polyposis syndromes are caused by germline alterations, Dr. Boland and colleagues pointed out that diagnoses are typically made based on clinical criteria, with germline results serving as confirmatory evidence.

The guideline recommends that any patient with a family history of hamartomatous polyps, or with a history of at least two hamartomatous polyps, should undergo genetic testing. The guideline also provides more nuanced genetic testing algorithms for each syndrome.

Among all the hamartomatous polyp disorders, Peutz-Jeghers syndrome is most understood, according to the investigators. It is caused by aberrations in the STK11 gene, and is characterized by polyps with “branching bands of smooth muscle covered by hyperplastic glandular mucosa” that may occur in the stomach, small intestine, and colon. Patients are also at risk of extraintestinal neoplasia.

For management of Peutz-Jeghers syndrome, the guideline advises frequent endoscopic surveillance to prevent mechanical obstruction and bleeding, as well as multidisciplinary surveillance of the breasts, pancreas, ovaries, testes, and lungs.

Juvenile polyposis syndrome is most often characterized by solitary, sporadic polyps in the colorectum (98% of patients affected), followed distantly by polyps in the stomach (14%), ileum (7%), jejunum (7%), and duodenum (7%). The condition is linked with abnormalities in BMPR1A or SMAD4 genes, with SMAD4 germline abnormalities more often leading to “massive” gastric polyps, gastrointestinal bleeding, protein-losing enteropathy, and a higher incidence of gastric cancer in adulthood. Most patients with SMAD4 mutations also have hereditary hemorrhagic telangiectasia, characterized by gastrointestinal bleeding from mucocutaneous telangiectasias, arteriovenous malformations, and epistaxis.

Management of juvenile polyposis syndrome depends on frequent colonoscopies with polypectomies beginning at 12-15 years.

“The goal of surveillance in juvenile polyposis syndrome is to mitigate symptoms related to the disorder and decrease the risk of complications from the manifestations, including cancer,” Dr. Boland and colleagues wrote.

PTEN-hamartoma tumor syndrome, which includes both Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome, is caused by abnormalities in the eponymous PTEN gene. Patients with the condition have an increased risk of colon cancer and polyposis, as well as extraintestinal cancers.

Diagnosis of PTEN-hamartoma tumor syndrome may be complex, involving “clinical examination, mammography and breast MRI, thyroid ultrasound, transvaginal ultrasound, upper gastrointestinal endoscopy, colonoscopy, and renal ultrasound,” according to the guideline.

After diagnosis, frequent colonoscopies are recommended, typically starting at age 35 years, as well as continued surveillance of other organs.

Hereditary mixed polyposis syndrome, which involves attenuated colonic polyposis, is the rarest of the four disorders, having been reported in only “a few families,” according to the guideline. The condition has been linked with “large duplications of the promoter region or entire GREM1 gene.”

Onset is typically in the late 20s, “which is when colonoscopic surveillance should begin,” the investigators wrote. More data are needed to determine appropriate surveillance intervals and if the condition is associated with increased risk of extraintestinal neoplasia.

This call for more research into gastrointestinal hamartomatous polyposis syndromes carried through to the conclusion of the guideline.

“Long-term prospective studies of mutation carriers are still needed to further clarify the risk of cancer and the role of surveillance in these syndromes,” Dr. Boland and colleagues wrote. “With increases in genetic testing and evaluation, future studies will be conducted with more robust cohorts of genetically characterized, less heterogeneous populations. However, there is also a need to study patients and families with unusual phenotypes where no genotype can be found.”

The investigators disclosed no conflicts of interest with the current guideline; however, they provided a list of industry relationships, including Salix Pharmaceuticals, Ferring Pharmaceuticals, and Pfizer, among others.

A clinical practice guideline for the diagnosis and management of gastrointestinal hamartomatous polyposis syndromes has just been published by the U.S. Multi-Society Task Force on Colorectal Cancer, which is comprised of experts representing the American College of Gastroenterology, the American Gastroenterological Association, and the American Society for Gastrointestinal Endoscopy.

Gastrointestinal hamartomatous polyposis syndromes are rare, autosomal dominant disorders associated with intestinal and extraintestinal tumors. Expert consensus statements have previously offered some recommendations for managing these syndromes, but clinical data are scarce, so the present review “is intended to establish a starting point for future research,” lead author C. Richard Boland, MD, of the University of California, San Diego, and colleagues reported.

According to the investigators, “there are essentially no long-term prospective controlled studies of comparative effectiveness of management strategies for these syndromes.” As a result, their recommendations are based on “low-quality” evidence according to GRADE criteria.

Still, Dr. Boland and colleagues highlighted that “there has been tremendous progress in recent years, both in understanding the underlying genetics that underpin these disorders and in elucidating the biology of associated premalignant and malignant conditions.”

The guideline was published online in Gastroenterology .
 

Four syndromes reviewed

The investigators gathered these data to provide an overview of genetic and clinical features for each syndrome, as well as management strategies. Four disorders are included: juvenile polyposis syndrome; Peutz-Jeghers syndrome; hereditary mixed polyposis syndrome; and PTEN-hamartoma tumor syndrome, encompassing Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome.

Although all gastrointestinal hamartomatous polyposis syndromes are caused by germline alterations, Dr. Boland and colleagues pointed out that diagnoses are typically made based on clinical criteria, with germline results serving as confirmatory evidence.

The guideline recommends that any patient with a family history of hamartomatous polyps, or with a history of at least two hamartomatous polyps, should undergo genetic testing. The guideline also provides more nuanced genetic testing algorithms for each syndrome.

Among all the hamartomatous polyp disorders, Peutz-Jeghers syndrome is most understood, according to the investigators. It is caused by aberrations in the STK11 gene, and is characterized by polyps with “branching bands of smooth muscle covered by hyperplastic glandular mucosa” that may occur in the stomach, small intestine, and colon. Patients are also at risk of extraintestinal neoplasia.

For management of Peutz-Jeghers syndrome, the guideline advises frequent endoscopic surveillance to prevent mechanical obstruction and bleeding, as well as multidisciplinary surveillance of the breasts, pancreas, ovaries, testes, and lungs.

Juvenile polyposis syndrome is most often characterized by solitary, sporadic polyps in the colorectum (98% of patients affected), followed distantly by polyps in the stomach (14%), ileum (7%), jejunum (7%), and duodenum (7%). The condition is linked with abnormalities in BMPR1A or SMAD4 genes, with SMAD4 germline abnormalities more often leading to “massive” gastric polyps, gastrointestinal bleeding, protein-losing enteropathy, and a higher incidence of gastric cancer in adulthood. Most patients with SMAD4 mutations also have hereditary hemorrhagic telangiectasia, characterized by gastrointestinal bleeding from mucocutaneous telangiectasias, arteriovenous malformations, and epistaxis.

Management of juvenile polyposis syndrome depends on frequent colonoscopies with polypectomies beginning at 12-15 years.

“The goal of surveillance in juvenile polyposis syndrome is to mitigate symptoms related to the disorder and decrease the risk of complications from the manifestations, including cancer,” Dr. Boland and colleagues wrote.

PTEN-hamartoma tumor syndrome, which includes both Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome, is caused by abnormalities in the eponymous PTEN gene. Patients with the condition have an increased risk of colon cancer and polyposis, as well as extraintestinal cancers.

Diagnosis of PTEN-hamartoma tumor syndrome may be complex, involving “clinical examination, mammography and breast MRI, thyroid ultrasound, transvaginal ultrasound, upper gastrointestinal endoscopy, colonoscopy, and renal ultrasound,” according to the guideline.

After diagnosis, frequent colonoscopies are recommended, typically starting at age 35 years, as well as continued surveillance of other organs.

Hereditary mixed polyposis syndrome, which involves attenuated colonic polyposis, is the rarest of the four disorders, having been reported in only “a few families,” according to the guideline. The condition has been linked with “large duplications of the promoter region or entire GREM1 gene.”

Onset is typically in the late 20s, “which is when colonoscopic surveillance should begin,” the investigators wrote. More data are needed to determine appropriate surveillance intervals and if the condition is associated with increased risk of extraintestinal neoplasia.

This call for more research into gastrointestinal hamartomatous polyposis syndromes carried through to the conclusion of the guideline.

“Long-term prospective studies of mutation carriers are still needed to further clarify the risk of cancer and the role of surveillance in these syndromes,” Dr. Boland and colleagues wrote. “With increases in genetic testing and evaluation, future studies will be conducted with more robust cohorts of genetically characterized, less heterogeneous populations. However, there is also a need to study patients and families with unusual phenotypes where no genotype can be found.”

The investigators disclosed no conflicts of interest with the current guideline; however, they provided a list of industry relationships, including Salix Pharmaceuticals, Ferring Pharmaceuticals, and Pfizer, among others.


FROM GASTROENTEROLOGY


Second-trimester blood test predicts preterm birth


A new blood test performed in the second trimester could help identify pregnancies at risk of early and very early spontaneous preterm birth (sPTB), based on a prospective cohort study.

The cell-free RNA (cfRNA) profiling tool could guide patient and provider decision-making, while the underlying research illuminates biological pathways that may facilitate novel interventions, reported lead author Joan Camunas-Soler, PhD, of Mirvie, South San Francisco, and colleagues.

“Given the complex etiology of this heterogeneous syndrome, it would be advantageous to develop predictive tests that provide insight on the specific pathophysiology leading to preterm birth for each particular pregnancy,” Dr. Camunas-Soler and colleagues wrote in the American Journal of Obstetrics and Gynecology. “Such an approach could inform the development of preventive treatments and targeted therapeutics that are currently lacking/difficult to implement due to the heterogeneous etiology of sPTB.”

Currently, the best predictor of sPTB is previous sPTB, according to the investigators. Although a combination approach that incorporates cervical length and fetal fibronectin in cervicovaginal fluid is “of use,” they noted, “this is not standard of care in the U.S.A. nor recommended by the American College of Obstetricians and Gynecologists or the Society for Maternal-Fetal Medicine.” Existing molecular tests lack clinical data and may be inaccurate across diverse patient populations, they added.

The present study aimed to address these shortcomings by creating a second-trimester blood test for predicting sPTB. To identify relevant biomarkers, the investigators compared RNA profiles that were differentially expressed in three types of cases: term birth, early sPTB, and very early sPTB.

Among 242 women who contributed second-trimester blood samples for analysis, 194 went on to have a term birth. Of the remaining 48 women who gave birth spontaneously before 35 weeks’ gestation, 32 delivered between 25 and 35 weeks (early sPTB), while 16 delivered before 25 weeks’ gestation (very early sPTB). Slightly more than half of the patients were White, about one-third were Black, approximately 10% were Asian, and the remainder were of unknown race/ethnicity. Cases of preeclampsia were excluded.

The gene discovery and modeling process revealed 25 distinct genes that were significantly associated with early sPTB, offering a risk model with a sensitivity of 76% and a specificity of 72% (area under the curve [AUC], 0.80; 95% confidence interval [CI], 0.72-0.87). Very early sPTB was associated with a set of 39 genes, giving a model with a sensitivity of 64% and a specificity of 80% (AUC, 0.76; 95% CI, 0.63-0.87).
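For readers less familiar with these metrics, here is a minimal sketch, assuming hypothetical outcome labels and risk scores rather than the Mirvie data, of how sensitivity, specificity, and AUC are computed at a chosen risk cutoff; the 0.5 threshold and all values below are invented for illustration.

```python
# Illustrative only: hypothetical labels and scores, not the study's data or model.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])     # 1 = sPTB, 0 = term birth (made up)
y_score = np.array([0.91, 0.78, 0.62, 0.35, 0.55,
                    0.40, 0.30, 0.22, 0.15, 0.08])     # predicted risk scores (made up)

threshold = 0.5                       # classify as "high risk" at or above this cutoff
y_pred = (y_score >= threshold).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)          # share of true sPTB cases flagged as high risk
specificity = tn / (tn + fp)          # share of term births correctly left unflagged
auc = roc_auc_score(y_true, y_score)  # threshold-free summary of ranking performance

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, AUC={auc:.2f}")
```

The published 76%/72% and 64%/80% figures are tied to a particular cutoff, while the AUC summarizes discrimination across all cutoffs.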

Characterization of the two RNA profiles offered a glimpse into the underlying biological processes driving preterm birth. The genes predicting early sPTB are largely responsible for extracellular matrix degradation and remodeling, which could, “in terms of mechanism, reflect ongoing processes associated with cervical shortening, a feature often detected some weeks prior to sPTB,” the investigators wrote. In contrast, genes associated with very early sPTB are linked with insulinlike growth factor transport, which drives fetal growth and placentation. These findings could lead to development of pathway-specific interventions, Dr. Camunas-Soler and colleagues suggested.

According to coauthor Michal A. Elovitz, MD, the Hilarie L. Morgan and Mitchell L. Morgan President’s Distinguished Professor in Women’s Health at the University of Pennsylvania, Philadelphia, and chief medical advisor at Mirvie, the proprietary RNA platform moves beyond “unreliable and at times biased clinical factors such as race, BMI, and maternal age” to offer a “precision-based approach to pregnancy health.”

Excluding traditional risk factors also “promises more equitable care than the use of broad sociodemographic factors that often result in bias,” she added, noting that this may help address the higher rate of pregnancy complications among Black patients.

When asked about the potential for false-positive results, considering reported specificity rates of 72%-80%, Dr. Elovitz suggested that such concerns among pregnant women are an “unfortunate misconception.”

“It is not reflective of what women want regarding knowledge about the health of their pregnancy,” she said in a written comment. “Rather than be left in the dark, women want to be prepared for what is to come in their pregnancy journey.”

In support of this statement, Dr. Elovitz cited a recent study involving women with preeclampsia and other hypertensive disorders in pregnancy. A questionnaire showed that women appreciated pregnancy risk models when making decisions, and reported that they would have greater peace of mind if such tests were available.

Laura Jelliffe-Pawlowski, PhD, of the University of California, San Francisco, California Preterm Birth Initiative, supported Dr. Elovitz’s viewpoint.

“If you talk to women who have delivered preterm most (but not all) say that they would have wanted to know their risk so they could have been better prepared,” she said in a written comment. “I think we need to shift the narrative to empowerment away from fear.”

Dr. Jelliffe-Pawlowski, who holds a patent for a separate test predicting preterm birth, said that the Mirvie RNA platform is “promising,” although she expressed concern that excluding patients with preeclampsia – representing approximately 4% of pregnancies in the United States – may have clouded accuracy results.

“What is unclear is how the test would perform more generally when a sample of all pregnancies was included,” she said. “Without that information, it is hard to compare their findings with other predictive models without such exclusions.”

Regardless of the model used, Dr. Jelliffe-Pawlowski said that more research is needed to determine best clinical responses when risk of sPTB is increased.

“Ultimately we want to connect action with results,” she said. “Okay, so [a woman] is at high risk for delivering preterm – now what? There is a lot of untapped potential once you start to focus more with women and birthing people you know have a high likelihood of preterm birth.”

The study was supported by Mirvie, Tommy’s Charity, and the National Institute for Health Research Biomedical Research Centre. The investigators disclosed financial relationships with Mirvie, including equity interest and/or intellectual property rights. Cohort contributors were remunerated for sample collection and/or shipping. Dr. Jelliffe-Pawlowski holds a patent for a different preterm birth prediction blood test.

*This story was updated on 4/26/2022. 


FROM AMERICAN JOURNAL OF OBSTETRICS AND GYNECOLOGY


Study: Fasting plus calorie counting offered no weight-loss benefit over calorie counting alone

Not so fast! Daily fasting with calorie restriction may not lead to shedding more pounds than just cutting back on calories, according to the authors of a new study.

Over the course of a year, study participants who ate only from 8:00 a.m. to 4:00 p.m. did not lose significantly more weight than individuals who ate whenever they wanted, nor did they achieve significantly greater improvements in other obesity-related health measures like body mass index (BMI) or metabolic risk, reported lead author Deying Liu, MD, of Nanfang Hospital, Southern Medical University, Guangzhou, China, and colleagues.

“[Daily fasting] has gained popularity because it is a weight-loss strategy that is simple to follow, which may enhance adherence,” Dr. Liu and colleagues wrote in the New England Journal of Medicine. However, “the long-term efficacy and safety of time-restricted eating as a weight-loss strategy are still uncertain, and the long-term effects on weight loss of time-restricted eating as compared with daily calorie restriction alone have not been fully explored.”

To learn more, Dr. Liu and colleagues recruited 139 adult patients with BMIs between 28 and 45. Individuals with serious medical conditions, such as malignant tumors, diabetes, chronic kidney disease, and others were excluded. Other exclusion criteria included smoking, ongoing participation in a weight-loss program, GI surgery within the prior year, use of medications that impact energy balance and weight, and planned or current pregnancy.

All participants were advised to eat calorie-restricted diets, with ranges of 1,500-1,800 kcal per day for men and 1,200-1,500 kcal per day for women. To determine the added impact of fasting, participants were randomized in a 1:1 ratio into time-restricted (fasting) or non–time-restricted (nonfasting) groups, in which fasting participants ate only during an 8-hour window from 8:00 a.m. to 4:00 p.m., whereas nonfasting participants ate whenever they wanted.

At 6 months and 12 months, participants were re-evaluated for changes in weight, body fat, BMI, blood pressure, lean body mass, and metabolic risk factors, including glucose level, triglycerides, and others.
 

Caloric intake restriction seems to explain most of the beneficial effects

At one-year follow-up, 118 participants (84.9%) remained in the study. Although members of the fasting group lost slightly more weight on average than those in the non-fasting group (mean, 8.0 kg vs. 6.3 kg), the difference between groups was not statistically significant (95% confidence interval, −4.0 to 0.4; P = .11).

Most of the other obesity-related health measures also trended toward favoring the fasting group, but again, none of these improvements was statistically significant. Waist circumference at 1 year, for example, decreased by a mean of 9.4 cm in the fasting group versus 8.8 cm in the nonfasting group, a net difference of −1.8 cm (95% CI, –4.0 to 0.5).
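As context for these comparisons, here is a minimal sketch, using invented weight-change values rather than the trial's measurements, of how a between-group difference, its 95% confidence interval, and a P value are obtained; a 95% CI that spans zero corresponds to a nonsignificant difference at the .05 level. The group sizes echo the 1:1 randomization of 139 participants, but nothing else is drawn from the study.

```python
# Illustrative only: hypothetical weight changes (kg), not the trial's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
fasting = rng.normal(loc=-8.0, scale=6.5, size=69)      # simulated 1-year changes, made up
nonfasting = rng.normal(loc=-6.3, scale=6.5, size=70)   # simulated 1-year changes, made up

diff = fasting.mean() - nonfasting.mean()
dof = fasting.size + nonfasting.size - 2
pooled_var = ((fasting.size - 1) * fasting.var(ddof=1)
              + (nonfasting.size - 1) * nonfasting.var(ddof=1)) / dof
se = np.sqrt(pooled_var * (1 / fasting.size + 1 / nonfasting.size))
t_crit = stats.t.ppf(0.975, dof)
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se

t_stat, p_value = stats.ttest_ind(fasting, nonfasting)  # pooled-variance two-sample t-test
print(f"net difference = {diff:.1f} kg; 95% CI = ({ci_low:.1f}, {ci_high:.1f}); P = {p_value:.2f}")
# When the 95% CI includes zero (and P > .05), the difference is not statistically significant.
```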

“We found that the two weight-loss regimens that we evaluated had similar success in patients with obesity, regardless of whether they reduced their calorie consumption through time-restricted eating or through calorie restriction alone,” Dr. Liu and colleagues concluded.

Principal investigator Huijie Zhang, MD, PhD, professor, chief physician, and deputy director of the department of endocrinology and metabolism at Nanfang Hospital, noted that their findings are “consistent with the findings in previous studies.”

“Our data suggest that caloric intake restriction explained most of the beneficial effects of a time-restricted eating regimen,” Dr. Zhang said.

Still, Dr. Zhang called time-restricted eating “a viable and sustainable approach for a person who wants to lose weight.”

More work is needed, Dr. Zhang said, to uncover the impact of fasting in “diverse groups,” including patients with chronic disease like diabetes and cardiovascular disease. Investigators should also conduct studies to compare outcomes between men and women, and evaluate the effects of other fasting durations.
 

 

 

Can trial be applied to a wider population?

According to Blandine Laferrère, MD, PhD, and Satchidananda Panda, PhD, of Columbia University Irving Medical Center, New York, and the Salk Institute for Biological Studies, La Jolla, Calif., respectively, “the results of the trial suggest that calorie restriction combined with time restriction, when delivered with intensive coaching and monitoring, is an approach that is as safe, sustainable, and effective for weight loss as calorie restriction alone.”

Yet Dr. Laferrère and Dr. Panda also expressed skepticism about broader implementation of a similar regime.

“The applicability of this trial to wider populations is debatable,” they wrote in an accompanying editorial. “The short time period for eating at baseline may be specific to the population studied, since investigators outside China have reported longer time windows. The rigorous coaching and monitoring by trial staff also leaves open the question of whether time-restricted eating is easier to adhere to than intentional calorie restriction. Such cost-benefit analyses are important for the assessment of the scalability of a lifestyle intervention.”
 

Duration is trial’s greatest strength

Kristina Varady, PhD, professor of nutrition in the department of kinesiology and nutrition at the University of Illinois at Chicago, said the “key strength” of the trial was its duration, at 12 months, making it the longest time-restricted eating trial to date; however, she was critical of the design.

“Quite frankly, I’m surprised this study got into such a high-caliber medical journal,” Dr. Varady said in a written comment. “It doesn’t even have a control group! It goes to show how popular these diets are and how much people want to know about them.”

She also noted that “the study was flawed in that it didn’t really look at the effects of true time-restricted eating.” According to Dr. Varady, combining calorie restriction with time-restricted eating “kind of defeats the purpose” of a time-restricted diet.

“The main benefit of time-restricted eating is that you don’t need to count calories in order to lose weight,” Dr. Varady said, citing two of her own studies from 2018 and 2020. “Just by limiting the eating window to 8 hours per day, people naturally cut out 300-500 calories per day. That’s why people like [time-restricted eating] so much.”

Dr. Varady was also “very surprised” at the adherence data. At 1 year, approximately 85% of the patients were still following the protocol, a notably higher rate than most dietary intervention studies, which typically report adherence rates of 50-60%, she said. The high adherence rate was particularly unexpected because of the 8:00 a.m.–4:00 p.m. eating window, Dr. Varady added, since that meant skipping “the family/social meal every evening over 1 whole year!”

The study was funded by the National Key Research and Development Project and others. The study investigators reported no conflicts of interest. Dr. Varady disclosed author fees from the Hachette Book Group for her book “The Every Other Day Diet.”


FROM THE NEW ENGLAND JOURNAL OF MEDICINE


Childhood abuse may increase risk of MS in women


Emotional or sexual abuse in childhood may increase risk of multiple sclerosis (MS) in women, and risk may increase further with exposure to multiple kinds of abuse, according to the first prospective cohort study of its kind.

More research is needed to uncover underlying mechanisms of action, according to lead author Karine Eid, MD, a PhD candidate at Haukeland University Hospital, Bergen, Norway, and colleagues.

“Trauma and stressful life events have been associated with an increased risk of autoimmune disorders,” the investigators wrote in the Journal of Neurology, Neurosurgery, & Psychiatry. “Whether adverse events in childhood can have an impact on MS susceptibility is not known.”

The present study recruited participants from the Norwegian Mother, Father and Child cohort, a population consisting of Norwegian women who were pregnant from 1999 to 2008. Of the 77,997 participating women, 14,477 reported emotional, sexual, and/or physical abuse in childhood, while the remaining 63,520 women reported no abuse. After a mean follow-up of 13 years, 300 women were diagnosed with MS, among whom 24% reported a history of childhood abuse, compared with 19% among women who did not develop MS.

To look for associations between childhood abuse and risk of MS, the investigators used a Cox model adjusted for confounders and mediators, including smoking, obesity, adult socioeconomic factors, and childhood social status. The model revealed that emotional abuse increased the risk of MS by 40% (hazard ratio [HR] 1.40; 95% confidence interval [CI], 1.03-1.90), and sexual abuse increased the risk of MS by 65% (HR 1.65; 95% CI, 1.13-2.39).

Although physical abuse alone did not significantly increase risk of MS (HR 1.31; 95% CI, 0.83-2.06), it did contribute to a dose-response relationship when women were exposed to more than one type of childhood abuse. Women exposed to two of the three abuse categories had a 66% increased risk of MS (HR 1.66; 95% CI, 1.04-2.67), whereas women exposed to all three types of abuse had the greatest increase in risk, at 93% (HR 1.93; 95% CI, 1.02-3.67).
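As background on how such adjusted estimates are produced, here is a minimal Cox proportional hazards sketch built on synthetic data with the lifelines library; the covariates, effect sizes, and follow-up times are invented and do not reproduce the Norwegian cohort analysis.

```python
# Illustrative only: synthetic data, not the Norwegian Mother, Father and Child cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 2000
abuse = rng.integers(0, 2, n)      # hypothetical childhood-abuse indicator
smoking = rng.integers(0, 2, n)    # hypothetical confounder
obesity = rng.integers(0, 2, n)    # hypothetical confounder

# Simulate times to MS diagnosis under proportional hazards with made-up effect sizes.
linear_predictor = 0.4 * abuse + 0.3 * smoking + 0.2 * obesity
time_to_event = rng.exponential(scale=60.0, size=n) / np.exp(linear_predictor)
event = (time_to_event <= 13.0).astype(int)    # diagnosed within ~13 years of follow-up
followup = np.minimum(time_to_event, 13.0)     # administrative censoring at 13 years

df = pd.DataFrame({"years": followup, "ms": event,
                   "abuse": abuse, "smoking": smoking, "obesity": obesity})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="ms")  # adjusts for every other column in df
print(cph.hazard_ratios_)   # exp(coef): e.g., HR for 'abuse' adjusted for smoking and obesity
cph.print_summary()         # coefficients, hazard ratios, and 95% CIs
```

A hazard ratio above 1 with a 95% CI that excludes 1, as in the published estimates, indicates a statistically significant increase in risk.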

Dr. Eid and colleagues noted that their findings are supported by previous retrospective research, and discussed possible mechanisms of action.

“The increased risk of MS after exposure to childhood sexual and emotional abuse may have a biological explanation,” they wrote. “Childhood abuse can cause dysregulation of the hypothalamic-pituitary-adrenal axis, lead to oxidative stress, and induce a proinflammatory state decades into adulthood. Psychological stress has been shown to disrupt the blood-brain barrier and cause epigenetic changes that may increase the risk of neurodegenerative disorders, including MS.

“The underlying mechanisms behind this association should be investigated further,” they concluded.
 

Study findings should guide interventions

Commenting on the research, Ruth Ann Marrie, MD, PhD, professor of medicine and community health sciences and director of the multiple sclerosis clinic at Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, said that the present study “has several strengths compared to prior studies – including that it is prospective and the sample size.”

Dr. Marrie, who was not involved in the study, advised clinicians in the field to take note of the findings, as patients with a history of abuse may need unique interventions.

“Providers need to recognize the higher prevalence of childhood maltreatment in people with MS,” Dr. Marrie said in an interview. “These findings dovetail with others that suggest that adverse childhood experiences are associated with increased mental health concerns and pain catastrophizing in people with MS. Affected individuals may benefit from additional psychological supports and trauma-informed care.”

Tiffany Joy Braley, MD, associate professor of neurology, and Carri Polick, RN and PhD candidate at the school of nursing, University of Michigan, Ann Arbor, who published a case report last year highlighting the importance of evaluating stress exposure in MS, suggested that the findings should guide interventions at both a system and patient level.

“Although a cause-and-effect relationship cannot be established by the current study, these and related findings should be considered in the context of system level and policy interventions that address links between environment and health care disparities,” they said in a joint, written comment. “Given recent impetus to provide trauma-informed health care, these data could be particularly informative in neurological conditions which are associated with high mental health comorbidity. Traumatic stress screening practices could lead to referrals for appropriate support services and more personalized health care.”

While several mechanisms have been proposed to explain the link between traumatic stress and MS, more work is needed in this area, they added.

This knowledge gap was acknowledged by Dr. Marrie.

“Our understanding of the etiology of MS remains incomplete,” Dr. Marrie said. “We still need a better understanding of mechanisms by which adverse childhood experiences lead to MS, how they interact with other risk factors for MS (beyond smoking and obesity), and whether there are any interventions that can mitigate the risk of developing MS that is associated with adverse childhood experiences.”

The investigators disclosed relationships with Novartis, Biogen, Merck, and others. Dr. Marrie receives research support from the Canadian Institutes of Health Research, the National Multiple Sclerosis Society, MS Society of Canada, the Consortium of Multiple Sclerosis Centers, Crohn’s and Colitis Canada, Research Manitoba, and the Arthritis Society; she has no pharmaceutical support. Dr. Braley and Ms. Polick reported no conflicts of interest.


Emotional or sexual abuse in childhood may increase risk of multiple sclerosis (MS) in women, and risk may increase further with exposure to multiple kinds of abuse, according to the first prospective cohort study of its kind.

More research is needed to uncover underlying mechanisms of action, according to lead author Karine Eid, MD, a PhD candidate at Haukeland University Hospital, Bergen, Norway, and colleagues.

“Trauma and stressful life events have been associated with an increased risk of autoimmune disorders,” the investigators wrote in the Journal Of Neurology, Neurosurgery, & Psychiatry. “Whether adverse events in childhood can have an impact on MS susceptibility is not known.”

The present study recruited participants from the Norwegian Mother, Father and Child cohort, a population consisting of Norwegian women who were pregnant from 1999 to 2008. Of the 77,997 participating women, 14,477 reported emotional, sexual, and/or physical abuse in childhood, while the remaining 63,520 women reported no abuse. After a mean follow-up of 13 years, 300 women were diagnosed with MS, among whom 24% reported a history of childhood abuse, compared with 19% among women who did not develop MS.

To look for associations between childhood abuse and risk of MS, the investigators used a Cox model adjusted for confounders and mediators, including smoking, obesity, adult socioeconomic factors, and childhood social status. The model revealed that emotional abuse increased the risk of MS by 40% (hazard ratio [HR] 1.40; 95% confidence interval [CI], 1.03-1.90), and sexual abuse increased the risk of MS by 65% (HR 1.65; 95% CI, 1.13-2.39).

Although physical abuse alone did not significantly increase risk of MS (HR 1.31; 95% CI, 0.83-2.06), it did contribute to a dose-response relationship when women were exposed to more than one type of childhood abuse. Women exposed to two out of three abuse categories had a 66% increased risk of MS (HR 1.66; 95% CI, 1.04-2.67), whereas women exposed to all three types of abuse had the highest risk of MS, at 93% (HR 1.93; 95% CI, 1.02-3.67).

Dr. Eid and colleagues noted that their findings are supported by previous retrospective research, and discussed possible mechanisms of action.

“The increased risk of MS after exposure to childhood sexual and emotional abuse may have a biological explanation,” they wrote. “Childhood abuse can cause dysregulation of the hypothalamic-pituitary-adrenal axis, lead to oxidative stress, and induce a proinflammatory state decades into adulthood. Psychological stress has been shown to disrupt the blood-brain barrier and cause epigenetic changes that may increase the risk of neurodegenerative disorders, including MS.

“The underlying mechanisms behind this association should be investigated further,” they concluded.
 

Study findings should guide interventions

Commenting on the research, Ruth Ann Marrie, MD, PhD, professor of medicine and community health sciences and director of the multiple sclerosis clinic at Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, said that the present study “has several strengths compared to prior studies – including that it is prospective and the sample size.”

Dr. Marrie, who was not involved in the study, advised clinicians in the field to take note of the findings, as patients with a history of abuse may need unique interventions.

“Providers need to recognize the higher prevalence of childhood maltreatment in people with MS,” Dr. Marrie said in an interview. “These findings dovetail with others that suggest that adverse childhood experiences are associated with increased mental health concerns and pain catastrophizing in people with MS. Affected individuals may benefit from additional psychological supports and trauma-informed care.”

Tiffany Joy Braley, MD, associate professor of neurology, and Carri Polick, RN and PhD candidate at the school of nursing, University of Michigan, Ann Arbor, who published a case report last year highlighting the importance of evaluating stress exposure in MS, suggested that the findings should guide interventions at both a system and patient level.

“Although a cause-and-effect relationship cannot be established by the current study, these and related findings should be considered in the context of system level and policy interventions that address links between environment and health care disparities,” they said in a joint, written comment. “Given recent impetus to provide trauma-informed health care, these data could be particularly informative in neurological conditions which are associated with high mental health comorbidity. Traumatic stress screening practices could lead to referrals for appropriate support services and more personalized health care.”

While several mechanisms have been proposed to explain the link between traumatic stress and MS, more work is needed in this area, they added.

This knowledge gap was acknowledged by Dr. Marrie.

“Our understanding of the etiology of MS remains incomplete,” Dr. Marrie said. “We still need a better understanding of mechanisms by which adverse childhood experiences lead to MS, how they interact with other risk factors for MS (beyond smoking and obesity), and whether there are any interventions that can mitigate the risk of developing MS that is associated with adverse childhood experiences.”

The investigators disclosed relationships with Novartis, Biogen, Merck, and others. Dr. Marrie receives research support from the Canadian Institutes of Health Research, the National Multiple Sclerosis Society, MS Society of Canada, the Consortium of Multiple Sclerosis Centers, Crohn’s and Colitis Canada, Research Manitoba, and the Arthritis Society; she has no pharmaceutical support. Dr. Braley and Ms. Polick reported no conflicts of interest.

Emotional or sexual abuse in childhood may increase risk of multiple sclerosis (MS) in women, and risk may increase further with exposure to multiple kinds of abuse, according to the first prospective cohort study of its kind.

More research is needed to uncover underlying mechanisms of action, according to lead author Karine Eid, MD, a PhD candidate at Haukeland University Hospital, Bergen, Norway, and colleagues.

“Trauma and stressful life events have been associated with an increased risk of autoimmune disorders,” the investigators wrote in the Journal Of Neurology, Neurosurgery, & Psychiatry. “Whether adverse events in childhood can have an impact on MS susceptibility is not known.”

The present study recruited participants from the Norwegian Mother, Father and Child cohort, a population consisting of Norwegian women who were pregnant from 1999 to 2008. Of the 77,997 participating women, 14,477 reported emotional, sexual, and/or physical abuse in childhood, while the remaining 63,520 women reported no abuse. After a mean follow-up of 13 years, 300 women were diagnosed with MS, among whom 24% reported a history of childhood abuse, compared with 19% among women who did not develop MS.

To look for associations between childhood abuse and risk of MS, the investigators used a Cox model adjusted for confounders and mediators, including smoking, obesity, adult socioeconomic factors, and childhood social status. The model revealed that emotional abuse increased the risk of MS by 40% (hazard ratio [HR] 1.40; 95% confidence interval [CI], 1.03-1.90), and sexual abuse increased the risk of MS by 65% (HR 1.65; 95% CI, 1.13-2.39).

Although physical abuse alone did not significantly increase risk of MS (HR 1.31; 95% CI, 0.83-2.06), it did contribute to a dose-response relationship when women were exposed to more than one type of childhood abuse. Women exposed to two out of three abuse categories had a 66% increased risk of MS (HR 1.66; 95% CI, 1.04-2.67), whereas women exposed to all three types of abuse had the highest risk of MS, at 93% (HR 1.93; 95% CI, 1.02-3.67).
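
For a concrete sense of the modeling behind these hazard ratios, the following is a minimal sketch of a Cox proportional hazards fit in Python using the lifelines library. The file name and every column name are hypothetical stand-ins rather than actual cohort variables, and the snippet is illustrative only, not the authors' analysis code.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical dataset: one row per participant, with follow-up time in years,
# an MS-diagnosis indicator, abuse-exposure indicators, and the adjustment
# variables named in the article (smoking, obesity, socioeconomic factors).
df = pd.read_csv("cohort_subset.csv")
cols = ["followup_years", "ms_diagnosis", "emotional_abuse", "sexual_abuse",
        "physical_abuse", "smoking", "obesity", "adult_ses", "childhood_ses"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_years", event_col="ms_diagnosis")
cph.print_summary()  # the exp(coef) column gives hazard ratios with 95% CIs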

Dr. Eid and colleagues noted that their findings are supported by previous retrospective research, and discussed possible mechanisms of action.

“The increased risk of MS after exposure to childhood sexual and emotional abuse may have a biological explanation,” they wrote. “Childhood abuse can cause dysregulation of the hypothalamic-pituitary-adrenal axis, lead to oxidative stress, and induce a proinflammatory state decades into adulthood. Psychological stress has been shown to disrupt the blood-brain barrier and cause epigenetic changes that may increase the risk of neurodegenerative disorders, including MS.

“The underlying mechanisms behind this association should be investigated further,” they concluded.
 

Study findings should guide interventions

Commenting on the research, Ruth Ann Marrie, MD, PhD, professor of medicine and community health sciences and director of the multiple sclerosis clinic at Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, said that the present study “has several strengths compared to prior studies – including that it is prospective and the sample size.”

Dr. Marrie, who was not involved in the study, advised clinicians in the field to take note of the findings, as patients with a history of abuse may need unique interventions.

“Providers need to recognize the higher prevalence of childhood maltreatment in people with MS,” Dr. Marrie said in an interview. “These findings dovetail with others that suggest that adverse childhood experiences are associated with increased mental health concerns and pain catastrophizing in people with MS. Affected individuals may benefit from additional psychological supports and trauma-informed care.”

Tiffany Joy Braley, MD, associate professor of neurology, and Carri Polick, RN and PhD candidate at the school of nursing, University of Michigan, Ann Arbor, who published a case report last year highlighting the importance of evaluating stress exposure in MS, suggested that the findings should guide interventions at both a system and patient level.

“Although a cause-and-effect relationship cannot be established by the current study, these and related findings should be considered in the context of system level and policy interventions that address links between environment and health care disparities,” they said in a joint, written comment. “Given recent impetus to provide trauma-informed health care, these data could be particularly informative in neurological conditions which are associated with high mental health comorbidity. Traumatic stress screening practices could lead to referrals for appropriate support services and more personalized health care.”

While several mechanisms have been proposed to explain the link between traumatic stress and MS, more work is needed in this area, they added.

This knowledge gap was acknowledged by Dr. Marrie.

“Our understanding of the etiology of MS remains incomplete,” Dr. Marrie said. “We still need a better understanding of mechanisms by which adverse childhood experiences lead to MS, how they interact with other risk factors for MS (beyond smoking and obesity), and whether there are any interventions that can mitigate the risk of developing MS that is associated with adverse childhood experiences.”

The investigators disclosed relationships with Novartis, Biogen, Merck, and others. Dr. Marrie receives research support from the Canadian Institutes of Health Research, the National Multiple Sclerosis Society, MS Society of Canada, the Consortium of Multiple Sclerosis Centers, Crohn’s and Colitis Canada, Research Manitoba, and the Arthritis Society; she has no pharmaceutical support. Dr. Braley and Ms. Polick reported no conflicts of interest.

FROM THE JOURNAL OF NEUROLOGY, NEUROSURGERY, & PSYCHIATRY


Real-world data suggest coprescribing PDE5 inhibitors and nitrates may be safe

Article Type
Changed
Thu, 04/21/2022 - 09:00

As coprescribing drugs for erectile dysfunction and oral organic nitrates for ischemic heart disease (IHD) surged, cardiovascular adverse events did not significantly increase, a new study finds.

The authors of the new research specifically examined how frequently phosphodiesterase type 5 (PDE5) inhibitors, such as Viagra, were prescribed. The U.S. Food and Drug Administration and the European Medicines Agency have warned that these drugs for erectile dysfunction are contraindicated for use with nitrates because of concerns about cardiovascular risks.

“Small, randomized, pharmacologic studies have reported an amplified decrease in blood pressure during controlled coexposure with nitrates and [phosphodiesterase type 5 inhibitors], both in healthy participants and in participants with IHD,” wrote lead author Anders Holt, MD, of Copenhagen University Hospital–Herlev and Gentofte and colleagues, in Annals of Internal Medicine. “Potentially, this increases the risk for vascular ischemic events including myocardial infarction and stroke.”

But there is a scarcity of real-world data showing that using both types of drugs together increases these risks, the researchers noted.

To address this knowledge gap, Dr. Holt and colleagues conducted a retrospective study involving 249,541 Danish men with IHD. In this overall population, from 2000 to 2018, prescriptions for PDE5 inhibitors increased 10-fold, from 3.1 to 30.9 prescriptions per 100 persons per year. Within a subgroup of 42,073 patients continuously prescribed oral organic nitrates, PDE5-inhibitor prescriptions rose even more steeply, roughly 20-fold, from 0.9 to 19.7 prescriptions per 100 persons per year.

Despite this surge in coprescribing, the investigators did not observe a significant increase in either of two composite measures of cardiovascular adverse events. The first composite included ischemic stroke, shock, cardiac arrest, myocardial infarction, or acute coronary arteriography (odds ratio, 0.58; 95% confidence interval, 0.28-1.13). The second composite included drug-related adverse events, angina pectoris, or syncope (OR, 0.73; CI, 0.40-1.32).
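
As a refresher on how such estimates are constructed, the short Python sketch below computes an odds ratio and its Wald 95% confidence interval from a 2-by-2 table. The counts are invented for illustration; the study itself relied on adjusted models of registry data rather than a raw 2-by-2 comparison.

import math

# Invented counts: events and non-events with and without PDE5-inhibitor coexposure.
events_exposed, nonevents_exposed = 12, 488
events_unexposed, nonevents_unexposed = 20, 480

odds_ratio = (events_exposed * nonevents_unexposed) / (nonevents_exposed * events_unexposed)
se_log_or = math.sqrt(1 / events_exposed + 1 / nonevents_exposed +
                      1 / events_unexposed + 1 / nonevents_unexposed)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")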
 

Lead author speculates on reasons for findings

“I propose several explanations [for these findings],” Dr. Holt said in an interview, “but I want to emphasize that our study does not contain any data to back it up. It is just speculation. First, the observed drop in blood pressure may not cause a condition for which patients seek a hospital. A drop in blood pressure has been shown in pharmacologic trials, but it might not translate to a real-life risk for cardiovascular outcomes. Second, patients could be well informed and adherent to guidance that the prescribing physician has provided. For example, patients are aware of the recommended pause in nitrate treatment before PDE5-inhibitor use and follow these recommendations. Third, nitrates are often taken in the morning, and with the careful assumption that most PDE5-inhibitor activities take place in the evening, the nitrates could be metabolized to a degree such that the synergistic interaction is negligible.”

Dr. Holt went on to suggest a novel clinical approach based on the new findings.

“Coadministration should still be contraindicated due to the proven drop in blood pressure,” he said. “However, perhaps physicians can allow for coprescription if patients are adequately informed.”

A qualitative study is needed to determine how patients and physicians discuss coprescription, including avoidance of coadministration, Dr. Holt added.
 

 

 

Findings call for a reassessment of whether the contraindication is warranted

Robert A. Kloner, MD, PhD, chief science officer at the Huntington Medical Research Institutes in Pasadena, Calif., and professor of medicine at University of Southern California, Los Angeles, previously conducted research exploring drug interactions with PDE5 inhibitors, and in 2018, coauthored a literature review that concluded that PDE5 inhibitors and nitrates are contraindicated.

But now, considering these new findings, Dr. Kloner is offering a fresh perspective.

“This study is reassuring,” Dr. Kloner said in an interview. “I think that it’s time to reassess whether there should be an absolute contraindication, or this should be more of like a warning.”

He noted that in controlled studies, like the ones he previously conducted, PDE5 inhibitors and nitrates were administered “very close to each other, on purpose,” yet this probably doesn’t reflect typical practice, in which clinicians can guide usage based on durations of drug metabolism.

“I think that physicians might be more comfortable now prescribing the drugs at the same time, but then telling patients that they shouldn’t take the two drugs simultaneously; they should wait and take the nitrate 24 hours after the last Viagra, or the nitrate 48 hours after the last Cialis,” Dr. Kloner said. “I suspect that that is happening. I suspect also the fact that people would be more likely to take the nitrate in the morning and the PDE5 inhibitor at night probably also contributes to the safety findings.”

Dr. Kloner noted that blood pressures vary throughout the day based on circadian rhythm, and that the body can adapt to some fluctuations without negative effects.

Some people could still experience a drop in blood pressure and become ill from the interaction of the two drugs, but that is probably not common, he said.
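
A rough sense of why such waiting periods are proposed comes from simple first-order elimination arithmetic. The sketch below uses approximate published elimination half-lives, roughly 4 hours for sildenafil and 17.5 hours for tadalafil; these figures are approximations, and the calculation is a back-of-the-envelope illustration, not dosing guidance.

# Fraction of a dose remaining after a given number of hours, assuming first-order elimination.
def fraction_remaining(hours: float, half_life_hours: float) -> float:
    return 0.5 ** (hours / half_life_hours)

# Approximate half-lives in hours; real pharmacokinetics vary by patient.
for drug, half_life, wait in [("sildenafil", 4.0, 24), ("tadalafil", 17.5, 48)]:
    print(f"{drug}: about {fraction_remaining(wait, half_life):.1%} remaining after {wait} hours")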

The study was supported by several grants. The investigators disclosed relationships with Merck, BMS, Bayer, and others. Dr. Kloner consults for Sanofi.


FROM ANNALS OF INTERNAL MEDICINE


Study: Physical fitness in children linked with concentration, quality of life

Article Type
Changed
Mon, 04/11/2022 - 11:06

Physically fit children have a greater ability to concentrate and better health-related quality of life (HRQOL), according to a new study.

The findings of the German study involving more than 6,500 kids emphasize the importance of cardiorespiratory health in childhood, and support physical fitness initiatives in schools, according to lead author Katharina Köble, MSc, of the Technical University of Munich (Germany), and colleagues.

“Recent studies show that only a few children meet the recommendations of physical activity,” the investigators wrote in Journal of Clinical Medicine.

While the health benefits of physical activity are clearly documented, Ms. Köble and colleagues noted that typical measures of activity, such as accelerometers or self-reported questionnaires, are suboptimal research tools.

“Physical fitness is a more objective parameter to quantify when evaluating health promotion,” the investigators wrote. “Furthermore, cardiorespiratory fitness as part of physical fitness is more strongly related to risk factors of cardiovascular disease than physical activity.”

According to the investigators, physical fitness has also been linked with better concentration and HRQOL, but never in the same population of children.

The new study aimed to address this knowledge gap by assessing 6,533 healthy children aged 6-10 years, approximately half boys and half girls. Associations between physical fitness, concentration, and HRQOL were evaluated using multiple linear regression analysis in participants aged 9-10 years.

Physical fitness was measured using a series of challenges, including curl-ups (pull-ups with palms facing body), push-ups, standing long jump, handgrip strength measurement, and the Progressive Aerobic Cardiovascular Endurance Run (PACER). The PACER, a multistage shuttle run, “requires participants to maintain the pace set by an audio signal, which progressively increases the intensity every minute.” Results of the PACER test were used to estimate VO2max.

Concentration was measured using the d2-R test, “a paper-pencil cancellation test, where subjects have to cross out all ‘d’ letters with two dashes under a time limit.”

HRQOL was evaluated with the KINDL questionnaire, which covers emotional well-being, physical well-being, everyday functioning (school), friends, family, and self-esteem.

Analysis showed that physical fitness improved with age (P < .001), except for VO2max in girls (P = .129). Concentration also improved with age (P < .001), while HRQOL did not (P = .179).

Among children aged 9-10 years, VO2max scores were strongly associated with both HRQOL (P < .001) and concentration (P < .001).

“VO2max was found to be one of the main factors influencing concentration levels and HRQOL dimensions in primary school children,” the investigators wrote. “Physical fitness, especially cardiorespiratory performance, should therefore be promoted more specifically in school settings to support the promotion of an overall healthy lifestyle in children and adolescents.”
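
For readers who want to picture the regression step described above, here is a minimal sketch using Python's statsmodels with hypothetical file and variable names; it is not the authors' code, and the covariate set shown is assumed for illustration.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: 9- to 10-year-olds with a d2-R concentration score,
# VO2max estimated from the PACER test, and basic covariates.
kids = pd.read_csv("fitness_subset.csv")

model = smf.ols("d2r_score ~ vo2max_est + age + C(sex) + bmi", data=kids).fit()
print(model.summary())  # the vo2max_est coefficient reflects the fitness-concentration association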
 

Findings are having a real-world impact, according to researcher

In an interview, Ms. Köble noted that the findings are already having a real-world impact.

“We continued data assessment in the long-term and specifically adapted prevention programs in school to the needs of the school children we identified in our study,” she said. “Schools are partially offering specific movement and nutrition classes now.”

In addition, Ms. Köble and colleagues plan on educating teachers about the “urgent need for sufficient physical activity.”

“Academic performance should be considered as an additional health factor in future studies, as well as screen time and eating patterns, as all those variables showed interactions with physical fitness and concentration. In a subanalysis, we showed that children with better physical fitness and concentration values were those who usually went to higher education secondary schools,” they wrote.
 

 

 

VO2max did not correlate with BMI

Gregory Weaver, MD, a pediatrician at Cleveland Clinic Children’s, voiced some concerns about the reliability of the findings. He noted that VO2max did not correlate with body mass index or other measures of physical fitness, and that using the PACER test to estimate VO2max may have skewed the association between physical fitness and concentration.

“It is quite conceivable that children who can maintain the focus to perform maximally on this test will also do well on other tests of attention/concentration,” Dr. Weaver said. “Most children I know would have a very difficult time performing a physical fitness test which requires them to match a recorded pace that slowly increases over time. I’m not an expert in the area, but it is my understanding that usually VO2max tests involve a treadmill which allows investigators to have complete control over pace.”

Dr. Weaver concluded that more work is needed to determine if physical fitness interventions can have a positive impact on HRQOL and concentration.

“I think the authors of this study attempted to ask an important question about the possible association between physical fitness and concentration among school aged children,” Dr. Weaver said in an interview. “But what is even more vital are studies demonstrating that a change in modifiable health factors like nutrition, physical fitness, or the built environment can improve quality of life. I was hoping the authors would show that an improvement in VO2max over time resulted in an improvement in concentration. Frustratingly, that is not what this article demonstrates.”

The investigators and Dr. Weaver reported no conflicts of interest.


FROM THE JOURNAL OF CLINICAL MEDICINE


New HBV model may open door to more effective antivirals

Long-sought-after breakthrough?
Article Type
Changed
Tue, 03/15/2022 - 17:03

A new mouse model that better represents chronic infection with hepatitis B virus (HBV) in humans may lead to more effective antiviral therapies for HBV, according to investigators.

During human infection, HBV genomes take the form of covalently closed circular DNA (cccDNA), a structure that has thwarted effective antiviral therapy and, until now, creation of an accurate mouse model, reported lead author Zaichao Xu, PhD, of Wuhan (China) University and colleagues.

“As the viral persistence reservoir plays a central role in HBV infection, HBV cccDNA is the key obstacle for a cure,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology.

Although several previous mouse models have approximated this phenomenon with recombinant cccDNA-like molecules (rcccDNA), the present model is the first to achieve genuine cccDNA, which does not naturally occur in mice.

“Although rcccDNA supports persistent viral replication and antigen expression, the nature of rcccDNA may differ from authentic cccDNA, as additional sequences, like LoxP or attR, were inserted into the HBV genome,” the investigators noted.

The new model was created by first constructing an adeno-associated virus vector carrying a replication-deficient HBV1.04-fold genome (AAV-HBV1.04). When injected into mice, the vector led to cccDNA formation via ataxia-telangiectasia and Rad3-related protein (ATR)–mediated DNA damage response, a finding that was confirmed by blocking the same process with ATR inhibitors.

Immediately after injection, mice tested positive for both hepatitis B e antigen (HBeAg) and hepatitis B surface antigen (HBsAg), with peak concentrations after either 4 or 8 weeks depending on dose. HBV DNA was also detected in serum after injection, and 50% of hepatocytes exhibited HBsAg and hepatitis B core protein (HBc) after 1 week. At week 66, HBsAg, HBeAg, and HBc were still detectable in the liver.

“The expression of HBc could only be observed in the liver, but not in other organs or tissues, suggesting that the AAV-HBV1.04 only targeted the mouse liver,” the investigators wrote.

Further experimentation involving known cccDNA-binding proteins supported the similarity between cccDNA in the mouse model and natural infection.

“These results suggested that the chromatinization and transcriptional activation of cccDNA formed in this model does not differ from wild-type cccDNA formed through infection.”

Next, Dr. Xu and colleagues demonstrated that the infected mice could serve as a reliable model for antiviral research. One week after injection with the vector, mice were treated with entecavir, polyinosinic-polycytidylic acid (poly[I:C]), or phosphate-buffered saline (PBS; control). As anticipated, entecavir suppressed circulating HBV DNA, but not HBsAg, HBeAg, or HBV cccDNA, whereas treatment with poly(I:C) reduced all HBV markers.

“This novel mouse model will provide a unique platform for studying HBV cccDNA and developing novel antivirals to achieve HBV cure,” the investigators concluded.

The study was supported by the National Natural Science Foundation of China, the Fundamental Research Funds for the Central Universities, Hubei Province’s Outstanding Medical Academic Leader Program, and others. The investigators reported no conflicts of interest.

Long-sought-after breakthrough?

On the heels of the wondrous development of curative antiviral agents for hepatitis C virus (HCV), renewed attention has been directed to efforts to bring about the cure of HBV. However, this task will hinge on successful elimination of covalently closed circular DNA (cccDNA), a highly stable form of viral DNA that is exceedingly difficult to eliminate. Efforts to develop successful curative strategies will in turn rely on development of small animal models that support HBV cccDNA formation and virus production, which has until recently proved elusive. In the past several years, several mouse HBV models supporting cccDNA formation have been constructed using adeno-associated vector (AAV)–mediated transduction of a linearized HBV genome. Both the AAV-HBV linear episome and cccDNA have been consistently replicated and detected in these models. While they recapitulate the key steps of the viral life cycle, these models do not lend themselves to direct assessment of cccDNA, which has traditionally required detection of cccDNA in the liver.

Dr. Raymond T. Chung
Xu et al. have now developed a novel mouse model in which generation of HBsAg is directly dependent on generation of cccDNA. This dependence thus yields a simple marker for assessment of cccDNA status and allows monitoring of the therapeutic effects of novel agents targeting cccDNA by simply following HBsAg titers. More studies are required to explore the mechanisms underlying HBV cccDNA formation and elimination, but this work suggests a new way forward to tractably evaluate agents that specifically interrupt cccDNA metabolism, an important step in our systematic march toward HBV cure.
 

Raymond T. Chung, MD, is a professor of medicine at Harvard Medical School and director of the Hepatology and Liver Center at Massachusetts General Hospital, both in Boston. He has no conflicts to disclose.


FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY


Bowel ultrasound may overtake colonoscopy in Crohn’s

A 'significant financial burden' avoided
Article Type
Changed
Mon, 04/11/2022 - 16:16

Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.

After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.

“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”

To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease diagnosed for at least 6 months and managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound with no more than 3 months between each procedure.

Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using several parameters, including bowel wall pattern, bowel wall thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cut-off of 3.52, which was determined by receiver operating characteristic (ROC) curve analysis.
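
For readers curious how such a threshold is typically derived, the sketch below shows one standard approach – maximizing Youden’s J statistic along a ROC curve – using synthetic data and scikit-learn. The variable names and values are illustrative assumptions only, not the study’s actual data or code.

```python
# Illustrative only: selecting an ultrasound-score cut-off that best separates
# endoscopically active from inactive cases, via Youden's J on a ROC curve.
# Synthetic data; the study's actual scoring and analysis may differ.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical continuous ultrasound scores with binary endoscopic-activity labels
scores_inactive = rng.normal(loc=2.5, scale=1.0, size=120)   # endoscopically inactive
scores_active = rng.normal(loc=4.5, scale=1.2, size=105)     # endoscopically active
scores = np.concatenate([scores_inactive, scores_active])
labels = np.concatenate([np.zeros(120, dtype=int), np.ones(105, dtype=int)])

fpr, tpr, thresholds = roc_curve(labels, scores)
youden_j = tpr - fpr
best_cutoff = thresholds[np.argmax(youden_j)]

print(f"AUC: {roc_auc_score(labels, scores):.2f}")
print(f"Cut-off maximizing Youden's J: {best_cutoff:.2f}")
```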

Participants were followed for 12 months after baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and negative disease course, defined by steroid usage, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to understand the relationship between ultrasound findings and endoscopy activity.

Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to 1 year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = .021). A worse clinical course at 1 year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).
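
As a rough illustration of how odds ratios and confidence intervals of this kind are produced, the following sketch fits a multivariable logistic regression on synthetic data with statsmodels and exponentiates the coefficients; the predictors and simulated effect sizes are assumptions for demonstration only, not the study’s data or model.

```python
# Illustrative only: deriving odds ratios and 95% CIs from a multivariable
# logistic regression. Synthetic data; not the study's dataset or model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 225

df = pd.DataFrame({
    "high_us_score": rng.integers(0, 2, n),   # ultrasound score > 3.52
    "complication": rng.integers(0, 2, n),    # >=1 baseline complication
    "fcal_ge_250": rng.integers(0, 2, n),     # fecal calprotectin >= 250 mcg/g
    "male": rng.integers(0, 2, n),
})

# Simulate a negative 12-month course influenced by the hypothetical predictors
logit = (-1.5 + 1.9 * df.high_us_score + 1.2 * df.complication
         + 1.6 * df.fcal_ge_250 + 0.9 * df.male)
df["negative_course"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(df["negative_course"],
                 sm.add_constant(df.drop(columns="negative_course"))).fit(disp=0)

# Exponentiate coefficients and their confidence bounds to get ORs and 95% CIs
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(model.conf_int()[0]),
    "CI_high": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios.round(3))
```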

The investigators then examined how baseline results related to individual disease outcomes at 12 months. For example, a high ultrasound score and an elevated fecal calprotectin at baseline each predicted the need for treatment escalation, whereas disease behavior (inflammatory, stricturing, or penetrating) and C-reactive protein (CRP) predicted the need for corticosteroids. The only significant predictor of hospitalization at 1 year was CRP.

“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”

The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.

A 'significant financial burden' avoided

Patients with Crohn’s disease (CD) undergo multiple colonoscopies during their lifetime. Endoscopic assessment is often necessary to determine the extent and severity of inflammation to guide the choice of therapy, to assess mucosal healing on current therapy, and for surveillance examination for colorectal dysplasia. Multiple colonoscopies over a lifetime present a significant financial burden for patients. The invasive nature of the procedure, along with the small but real risk of perforation and the associated patient discomfort, makes for an undesirable experience. Cross-sectional imaging offers the advantage of a noninvasive modality to assess the bowel wall and extraluminal complications related to CD. Bowel ultrasound, performed as point-of-care imaging by gastroenterologists, is an emerging imaging alternative for visualizing the bowel.

In the study by Allocca et al., the authors developed a bowel ultrasound–based score incorporating bowel wall thickness, pattern, flow, and presence of extraluminal complications. The score was developed by comparing ultrasound parameters with colonoscopy findings for each segment of the colon and terminal ileum. In a cohort of 225 patients, a bowel ultrasound score of >3.52 along with at least one extraluminal complication, baseline fecal calprotectin of >250 mcg/g, and male gender were linked with adverse outcomes within 12 months (defined as need for steroids, change of therapy, hospitalization, or surgery).

Dr. Manreet Kaur

While these observations need to be validated externally, this study further consolidates the role of bowel ultrasound as a viable imaging modality to monitor disease and response to therapy in CD. Prior studies have shown bowel ultrasound is a valid alternative to MR enterography – without the expense, limited availability, and need for gadolinium contrast. As the therapeutic targets in IBD move toward mucosal healing, bowel ultrasound offers the promise of a cost-effective, noninvasive, point-of-care test that can be performed during an office consultation. The operator-dependent nature of this modality may limit its uptake and utilization. The International Bowel Ultrasound Group (IBUS) has collaborated with the European Crohn’s and Colitis Organisation as well as the Canadian Association of Gastroenterology to establish training and research in bowel ultrasound. Soon, patients can expect a bowel ultrasound to become part of their routine assessment during an office consultation.

Manreet Kaur, MD, is medical director of the Inflammatory Bowel Disease Center and an associate professor in the division of gastroenterology and hepatology at Baylor College of Medicine, Houston. She has no relevant conflicts of interest.
 



FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
