Complete endoscopic healing tied to better Crohn’s disease outcomes
Patients with Crohn’s disease who experienced complete endoscopic healing with biologic therapy had significantly lower subsequent rates of treatment failure, intestinal resection, and hospitalization, compared with patients who experienced only partial mucosal healing, according to the findings of a two-center retrospective study.
Over a median of 4.8 years of follow-up (interquartile range, 2.1-7.2 years), rates of treatment failure were 25% in patients with complete mucosal healing at baseline (that is, a Crohn’s Disease Endoscopic Index of Severity [CDEIS] score of 0) and 48% in patients with partial healing (CDEIS score greater than 0 but less than 4). The difference was statistically significant (P = .045). No patient with a baseline CDEIS score of 0 required intestinal resection, compared with 11% of patients with scores greater than 0 but less than 4 (P = .031). Rates of hospitalization because of Crohn’s disease were 3.5% versus 18.5%, respectively (P = .013). Clara Yzet, MD, of Amiens (France) University Hospital, Université de Picardie Jules Verne, reported the findings together with her associates in Clinical Gastroenterology and Hepatology.
Mucosal healing is a key therapeutic target in Crohn’s disease that has been linked to desirable outcomes, such as steroid-free remission and a lower rate of intestinal resection. However, prior observational studies have inconsistently defined mucosal healing, and clinical trials of biologics have used any of at least seven different definitions, the researchers wrote. Recently, in patients with ulcerative colitis, a study in the Scandinavian Journal of Gastroenterology and another in the Journal of Crohn’s and Colitis linked a stricter definition of mucosal healing (an endoscopic Mayo score of 0, or histologic healing) with superior long-term clinical outcomes. In patients with Crohn’s disease, however, there has been no established threshold for mucosal healing based on either the CDEIS or the Simplified Endoscopic Score for Crohn’s disease (SES-CD).
Therefore, Dr. Yzet and her associates identified and reviewed the medical records of 84 consecutive adults with clinically remitted Crohn’s disease who received anti–tumor necrosis factor therapies (infliximab and adalimumab) or vedolizumab at two university hospitals in France between 2008 and 2015. All patients received baseline and follow-up colonoscopies, with results scored on the CDEIS. In all cases, the second CDEIS score was less than 4 (the CDEIS ranges from 0 to 44). The primary outcome was treatment failure, defined as the need for biologic optimization (increasing the dose or shortening the dosing interval of the biologic), corticosteroids, or immunosuppressants; a Harvey-Bradshaw score greater than 4 associated with a change in treatment; or the need for intestinal resection or hospitalization because of Crohn’s disease.
At baseline, 57 patients had CDEIS scores of 0 (complete mucosal healing) and 27 patients had scores greater than 0 but less than 4 (partial mucosal healing). The two groups were otherwise similar except that patients with complete mucosal healing had a shorter median duration of Crohn’s disease (10.3 vs. 15.1 years in the partial healing group; P = .029) and a lower prevalence of Crohn’s disease phenotype B2 (stricturing) according to the Vienna classification (1.8% vs. 14.8%; P = .035). In the multivariate analysis, CDEIS score was the only factor associated with treatment failure (hazard ratio, 2.61; 95% confidence interval, 1.16-5.88; P = .02).
“Our findings suggest that we should be more ambitious in clinical practice to change patients’ life and disease course by achieving complete endoscopic healing. However, this strategy could be limited by the ability of current drugs to achieve complete mucosal healing,” the researchers wrote. “Obtaining a complete mucosal healing would require today a significant need for optimization or change of biologics.”
No external funding sources were reported. Two coinvestigators disclosed ties to AbbVie, Amgen, Biogaran, Biogen, Ferring, Janssen, MSD, Pfizer, Takeda, and several other pharmaceutical companies. The remaining investigators reported having no relevant conflicts of interest.
SOURCE: Yzet C et al. Clin Gastroenterol Hepatol. 2019 Nov 16. doi: 10.1016/j.cgh.2019.11.025.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Study eyes risks for poor outcomes in primary sclerosing cholangitis
In individuals with inflammatory bowel disease and primary sclerosing cholangitis, younger age at diagnosis, male sex, and Afro-Caribbean heritage were significant risk factors for liver transplantation and disease-related death, based on a 10-year prospective population-based study.
These factors should be incorporated into the design of clinical trials, models for predicting disease, and studies of prognostic biomarkers for primary sclerosing cholangitis, Palak T. Trivedi, MBBS, MRCP, of the University of Birmingham (England) wrote with his associates in Gastroenterology.
The researchers identified newly diagnosed cases from a national health care registry in England between 2006 and 2016 (data on outcomes were collected through mid-2019). In all, 284,560 individuals had a new diagnosis of inflammatory bowel disease, among whom 2,588 also had primary sclerosing cholangitis. The investigators tracked deaths, liver transplantation, colonic resection, cholecystectomy, and diagnoses of colorectal cancer, cholangiocarcinoma, and cancers of the pancreas, gallbladder, and liver. They evaluated rates of these outcomes among individuals with both primary sclerosing cholangitis and inflammatory bowel disease (PSC-IBD) and those with IBD only.
After controlling for sex, race, socioeconomic level, comorbidities, and older age, the researchers found that both men and women with PSC-IBD had a significantly greater risk for all-cause mortality, compared with individuals with IBD alone (hazard ratio, 3.20; 95% confidence interval, 3.01-3.40; P less than .001). Strikingly, individuals who were diagnosed with PSC when they were younger than 40 years had a more than sevenfold higher rate of all-cause mortality, compared with individuals with IBD only. In contrast, the incidence rate ratio for individuals diagnosed with PSC when they were older than 60 years was less than 1.5, compared with IBD-only individuals.
Having PSC and ulcerative colitis, being younger when diagnosed with PSC, and being of Afro-Caribbean heritage all correlated with higher incidence of liver transplantation or death related to PSC. Individuals with PSC-IBD who were of Afro-Caribbean heritage had an approximately twofold greater risk for liver transplantation or PSC-related death compared with Whites (adjusted HR, 2.05; 95% CI, 1.14-3.70; P = .016). In contrast, women with PSC-IBD were at significantly lower risk for liver transplantation or disease-related death than were men (adjusted HR, 0.74; 95% CI, 0.57-0.97; P = .026).
“The onset of PSC confers heightened risks of all hepatobiliary malignancies, although annual imaging surveillance may associate with a reduced risk of cancer-related death,” the investigators found. Among patients with hepatobiliary cancer, annual imaging was associated with a twofold decrease in risk for cancer-related death (HR, 0.43; 95% CI, 0.23-0.80; P = .037).
Colorectal cancer tended to occur at a younger age among individuals with PSC-IBD, compared with those with IBD alone (median ages at diagnosis, 59 vs. 69 years; P less than .001). Notably, individuals with PSC diagnosed under age 50 years had about a fivefold higher incidence of colorectal cancer than did those with IBD alone, while those diagnosed at older ages had only about a twofold increase. With regard to colectomy, men diagnosed with PSC at younger ages were at the greatest risk, compared with women or individuals diagnosed after age 50 years. Individuals with ulcerative colitis and PSC had a greater risk for colectomy than did IBD-only individuals (time-dependent adjusted HR, 1.65; 95% CI, 1.45-1.85; P less than .001).
“Whilst all-cause mortality rates increase with age, younger patients [with PSC] show a disproportionately increased incidence of liver transplantation, PSC-related death, and colorectal cancer,” the researchers concluded. “Consideration of age at diagnosis should therefore be applied in the stratification of patients for future clinical trials, disease prediction models, and prognostic biomarker discovery.”
Dr. Trivedi disclosed support from the National Institute for Health Research Birmingham Biomedical Research Centre, at the University Hospitals Birmingham NHS Foundation Trust and the University of Birmingham. No other disclosures were reported.
SOURCE: Trivedi PJ et al. Gastroenterology. 2020 May 19. doi: 10.1053/j.gastro.2020.05.049.
FROM GASTROENTEROLOGY
Model identified heavy drinkers at highest risk of ALD progression
In heavy drinkers with alcohol-related liver disease, a Markov model based on age, sex, body mass index, and duration and extent of alcohol use predicted risk for disease progression, researchers reported in Clinical Gastroenterology and Hepatology.
The study included 2,334 hospitalized adults with consistently abnormal liver test results who had consumed at least 50 grams of alcohol (about 3.5-4 drinks) per day for the previous 5 years. The model was developed using data from 1,599 individuals with baseline liver biopsies and validated in 735 individuals with no baseline liver biopsies but available data on the presence or absence of hepatic decompensation.
For a 40-year-old man with F0-F2 fibrosis who had been drinking alcohol for 15 years, who drank 150 grams of alcohol daily, and who had a body mass index (BMI) of 22 kg/m², the model predicted a 31.8% likelihood of having a normal liver at baseline, a 61.5% probability of baseline steatosis, and a 6.7% probability of baseline steatohepatitis. In women with the same baseline variables, respective probabilities were 25.1%, 66.5%, and 8.4%. Based on these findings, the 5-year weighted risk for liver complications ranged from 0.2% for men with normal initial liver findings to 10.3% for men with baseline steatohepatitis. Among women, the corresponding risk estimates ranged from 0.5% to 14.7%, wrote PhD student Claire Delacôte of Centre Hospitalier Universitaire de Lille (France), and associates.
“This tool might be used by general practitioners or hepatologists to identify heavy drinkers at high risk for alcohol-related liver disease progression,” the investigators added. “This model might be used to adapt patient care pathways.”
The patients in this study were admitted to the hepatogastroenterology unit of a French hospital between 1982 and 1997. The Markov model incorporated seven stages of alcohol-related liver disease: normal liver (no fibrosis or steatosis), steatosis and F0-F2 fibrosis, alcohol-induced steatohepatitis and F0-F2 fibrosis, steatosis and F3-F4 fibrosis, alcohol-induced steatohepatitis and F3-F4 fibrosis, liver complications without steatohepatitis, and liver complications with alcohol-induced steatohepatitis. Liver complications were defined as hepatocellular carcinoma or liver decompensation (bilirubin >50 mmol/L, gastrointestinal hemorrhage, or ascites). Risk for progressing to liver complications was based on METAVIR score and onset of alcohol-induced steatohepatitis.
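To make the multistate structure concrete, the seven-stage design described above can be sketched as a discrete-time Markov model. The transition probabilities below are illustrative placeholders, not the published model parameters, and the one-year time step is an assumption for the sketch; only the state list mirrors the article.

```python
# Minimal sketch of a seven-state Markov progression model in the spirit
# of the Delacôte study. All transition probabilities are hypothetical
# placeholders; only the state definitions follow the article.
import numpy as np

STATES = [
    "normal",                 # no fibrosis or steatosis
    "steatosis_F0F2",         # steatosis, F0-F2 fibrosis
    "steatohepatitis_F0F2",   # alcohol-induced steatohepatitis, F0-F2
    "steatosis_F3F4",         # steatosis, F3-F4 fibrosis
    "steatohepatitis_F3F4",   # alcohol-induced steatohepatitis, F3-F4
    "complications_no_ash",   # HCC or decompensation, no steatohepatitis
    "complications_ash",      # HCC or decompensation, with steatohepatitis
]

# Hypothetical annual transition matrix (rows sum to 1); the two
# complication states are treated as absorbing.
P = np.array([
    [0.90, 0.08, 0.02, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.05, 0.08, 0.00, 0.02, 0.00],
    [0.00, 0.00, 0.80, 0.00, 0.12, 0.00, 0.08],
    [0.00, 0.00, 0.00, 0.85, 0.05, 0.10, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.80, 0.00, 0.20],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])

def five_year_complication_risk(start_state: str, years: int = 5) -> float:
    """Probability of reaching either complication state within `years`."""
    v = np.zeros(len(STATES))
    v[STATES.index(start_state)] = 1.0
    v = v @ np.linalg.matrix_power(P, years)
    return v[STATES.index("complications_no_ash")] + v[STATES.index("complications_ash")]
```

With a matrix of this shape, the study's finding that baseline steatohepatitis carries a much higher 5-year complication risk than early-stage steatosis falls out of the relative size of the transition probabilities into the absorbing complication states.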
The researchers also looked specifically at F3-F4 (severe) fibrosis because of its clinical significance and common use as a study endpoint. Among 40-year-olds with a 15-year history of heavy drinking, the estimated prevalence of alcohol-induced steatohepatitis was 30.0% for men and 33.3% for women. The 5-year risk for liver complications was higher in women (30.1%) than men (24.5%) and was highest among women with baseline alcohol-induced steatohepatitis (41.0%). Overall, women had a 24.8% greater risk for disease progression than men (hazard ratio, 1.248).
Risk for liver complications also increased with age, and each 1-year increase in age at the beginning of heavy drinking heightened the risk for disease progression by 3.8%, regardless of stage of liver disease. “Based on these predictions, 50-year-old women are a high-risk subgroup of [alcohol-related liver] disease progression and should receive close follow-up,” the researchers wrote.
In addition, obese individuals (BMI, 30) had an 11.8% greater risk for progression of alcohol-related liver disease, compared with those with a BMI of 22. Consuming an additional 10 grams of alcohol per day had less impact on risk, the researchers noted.
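Under a proportional-hazards reading of the figures above, the separate relative effects multiply. A hedged back-of-the-envelope sketch, using the multipliers quoted in the article (female sex, later age at onset of heavy drinking, and obesity); the function name and the idea of composing them this way are illustrative assumptions, not the authors' published calculator:

```python
# Illustrative combination of the relative effects reported in the
# article, assuming proportional hazards. Multipliers are taken from the
# text; composing them multiplicatively is a simplifying assumption.
HR_FEMALE = 1.248          # women vs. men (hazard ratio, 1.248)
HR_PER_YEAR_OLDER = 1.038  # per 1-year later start of heavy drinking (+3.8%)
HR_OBESE = 1.118           # BMI 30 vs. 22 (+11.8%)

def relative_progression_risk(female: bool, extra_years_at_onset: int, obese: bool) -> float:
    """Risk multiplier relative to a male, baseline-onset, BMI-22 reference."""
    rr = 1.0
    if female:
        rr *= HR_FEMALE
    rr *= HR_PER_YEAR_OLDER ** extra_years_at_onset
    if obese:
        rr *= HR_OBESE
    return rr
```

For example, a woman who began heavy drinking 10 years later than the reference would carry a multiplier of roughly 1.248 × 1.038¹⁰ ≈ 1.8, which is consistent with the authors' flagging of 50-year-old women as a high-risk subgroup.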
“If patients are identified as being heavy drinkers by the general practitioner with no evaluation of fibrosis, these patients should be referred to a hepatologist. Nevertheless, we think that the threshold defining the high-risk population, which has been arbitrarily fixed at 5%, should be discussed by experts because it affects the patient’s care pathway. An online application is being developed to help clinicians and general practitioners in their daily practice,” they wrote.
No funding sources were reported. Ms. Delacôte reported having no conflicts of interest. Three coinvestigators disclosed ties to AbbVie, Bayer Healthcare, Eisai, Gilead, MSD, Novartis, Sanofi, and Servier. The others reported having no conflicts.
SOURCE: Delacôte C et al. Clin Gastroenterol Hepatol. 2020 Jan 11. doi: 10.1016/j.cgh.2019.12.041.
In the life of a hepatologist, few things are as gratifying as when a patient with alcohol-related liver disease (ALD) quits drinking. Though we wish this were the norm, ALD is both increasingly common and morbid. Tools to connect with and empower real change in our patients with ALD are urgently needed. Unfortunately, our toolbox is somewhat bare.
To improve, we must become accustomed to (and partner with experts in) the care of substance use disorder. We must learn to maximize the impact of our counseling on our patients. Behavioral interventions for ALD require goal-setting and self-regulation, and both depend on the patient’s outcome expectations. All would be immeasurably strengthened with concrete prognostic data.
This is why the Delacôte study is important. The authors create a multistate model with inputs from cohorts of patients with biopsy-proven and staged ALD. The result is a specific 5-year risk of cirrhotic decompensation or hepatocellular carcinoma tailored to the patient’s age, sex, body mass index, alcohol use duration, and liver histology. Although this model’s estimates have wide confidence intervals and their generalizability would improve if histology were replaced with noninvasive indices, these data are among the most tangible illustrations of risk available for patient-doctor deliberations.
Knowledge, when communicated effectively, is the cornerstone of behavioral change. Translating the abstract concept of progressive ALD into personalized, modifiable risks is a leap forward. We have a new tool; let’s use it.
Elliot B. Tapper, MD, is an assistant professor in gastroenterology and internal medicine at Michigan Medicine, Ann Arbor. He has no conflicts of interest.
In heavy drinkers with alcohol-related liver disease, a Markov model based on age, sex, body mass index, and duration and extent of alcohol use predicted risk for disease progression, researchers reported in Clinical Gastroenterology and Hepatology.
The study included 2,334 hospitalized adults with consistently abnormal liver test results who had consumed at least 50 grams of alcohol (about 3.5-4 drinks) per day for the previous 5 years. The model was developed using data from 1,599 individuals with baseline liver biopsies and validated in 735 individuals with no baseline liver biopsies but available data on the presence or absence of hepatic decompensation.
For a 40-year-old man with F0-F2 fibrosis who had been drinking alcohol for 15 years, who drank 150 grams of alcohol daily, and who had a body mass index (BMI) of 22 kg/m2, the model predicted a 31.8% likelihood of having a normal liver at baseline, a 61.5% probability of baseline steatosis, and a 6.7% probability of baseline steatohepatitis. In women with the same baseline variables, respective probabilities were 25.1%, 66.5%, and 8.4%. Based on these findings, the 5-year weighted risk for liver complications ranged from 0.2% for men with normal initial liver findings to 10.3% for men with baseline steatohepatitis. Among women, the corresponding risk estimates ranged from 0.5% to 14.7%, wrote PhD student Claire Delacôte of Centre Hospitalier Universitaire de Lille (France), and associates.
“This tool might be used by general practitioners or hepatologists to identify heavy drinkers at high risk for alcohol-related liver disease progression,” the investigators added. “This model might be used to adapt patient care pathways.”
The patients in this study were admitted to the hepatogastroenterology unit of a French hospital between 1982 and 1997. The Markov model incorporated seven stages of alcohol-related liver disease: normal liver (no fibrosis or steatosis), steatosis and F0-F2 fibrosis, alcohol-induced steatohepatitis and F0-F2 fibrosis, steatosis and F3-F4 fibrosis, alcohol-induced steatohepatitis and F3-F4 fibrosis, liver complications without steatohepatitis, and liver complications with alcohol-induced steatohepatitis. Liver complications were defined as hepatocellular carcinoma or liver decompensation (bilirubin >50 µmol/L, gastrointestinal hemorrhage, or ascites). Risk for progressing to liver complications was based on METAVIR score and onset of alcohol-induced steatohepatitis.
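As a rough sketch of how such a multistate model works, the seven stages described above can be encoded as a discrete-time Markov chain and an initial state distribution propagated forward year by year. The state list below mirrors the article; the transition probabilities are invented placeholders for illustration only, not the fitted values from the Delacôte model.

```python
# Illustrative discrete-time Markov chain over the seven disease states
# described in the article. The one-year transition probabilities below
# are made-up placeholders, NOT the fitted values from the study.

STATES = [
    "normal liver",                        # no fibrosis or steatosis
    "steatosis, F0-F2",
    "steatohepatitis, F0-F2",
    "steatosis, F3-F4",
    "steatohepatitis, F3-F4",
    "complications, no steatohepatitis",
    "complications, steatohepatitis",
]

# Hypothetical one-year transition matrix (each row sums to 1).
P = [
    [0.90, 0.08, 0.02, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.08, 0.05, 0.00, 0.02, 0.00],
    [0.00, 0.00, 0.80, 0.00, 0.15, 0.00, 0.05],
    [0.00, 0.00, 0.00, 0.85, 0.10, 0.05, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.85, 0.00, 0.15],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00, 0.00],  # complications are absorbing
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
]

def step(dist, P):
    """Advance a state distribution by one year."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def five_year_complication_risk(initial):
    """Probability of being in either complication state after 5 years."""
    dist = initial
    for _ in range(5):
        dist = step(dist, P)
    return dist[5] + dist[6]

# Example: a patient starting with steatosis and F0-F2 fibrosis.
initial = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
print(f"Illustrative 5-year complication risk: {five_year_complication_risk(initial):.1%}")
```

The actual model additionally conditions transition intensities on age, sex, BMI, and drinking history; this sketch shows only the state-propagation mechanics.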
The researchers also looked specifically at F3-F4 (severe) fibrosis because of its clinical significance and common use as a study endpoint. Among 40-year-olds with a 15-year history of heavy drinking, the estimated prevalence of alcohol-induced steatohepatitis was 30.0% for men and 33.3% for women. The 5-year risk for liver complications was higher in women (30.1%) than men (24.5%) and was highest among women with baseline alcohol-induced steatohepatitis (41.0%). Overall, women had a 24.8% greater risk for disease progression than men (hazard ratio, 1.248).
Risk for liver complications also increased with age, and each 1-year increase in age at the beginning of heavy drinking heightened the risk for disease progression by 3.8%, regardless of stage of liver disease. “Based on these predictions, 50-year-old women are a high-risk subgroup of [alcohol-related liver] disease progression and should receive close follow-up,” the researchers wrote.
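On the hazard scale, per-unit effects like these compound multiplicatively. The hazard ratios below are the figures reported in the article; combining them as independent multiplicative effects is an illustrative proportional-hazards assumption, not a calculation from the paper.

```python
# Per-unit hazard ratios reported in the article. In a proportional-hazards
# framework these compound multiplicatively; treating them as independent
# effects here is an illustrative simplification.

HR_PER_YEAR_OF_AGE = 1.038   # +3.8% per year of age at onset of heavy drinking
HR_FEMALE = 1.248            # women vs. men
HR_OBESE = 1.118             # BMI 30 vs. BMI 22

def combined_hazard_ratio(extra_years, female=False, obese=False):
    """Multiply per-unit hazard ratios, as a proportional-hazards model would."""
    hr = HR_PER_YEAR_OF_AGE ** extra_years
    if female:
        hr *= HR_FEMALE
    if obese:
        hr *= HR_OBESE
    return hr

# Ten extra years of age at onset alone: 1.038**10, roughly a 45% higher hazard.
print(round(combined_hazard_ratio(10), 2))
```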
In addition, obese individuals (BMI, 30) had an 11.8% greater risk for progression of alcohol-related liver disease, compared with those with a BMI of 22. Consuming an additional 10 grams of alcohol per day had less impact on risk, the researchers noted.
“If patients are identified as being heavy drinkers by the general practitioner with no evaluation of fibrosis, these patients should be referred to a hepatologist. Nevertheless, we think that the threshold defining the high-risk population, which has been arbitrarily fixed at 5%, should be discussed by experts because it affects the patient’s care pathway. An online application is being developed to help clinicians and general practitioners in their daily practice,” they wrote.
No funding sources were reported. Ms. Delacôte reported having no conflicts of interest. Three coinvestigators disclosed ties to AbbVie, Bayer Healthcare, Eisai, Gilead, MSD, Novartis, Sanofi, and Servier. The others reported having no conflicts.
SOURCE: Delacôte C et al. Clin Gastroenterol Hepatol. 2020 Jan 11. doi: 10.1016/j.cgh.2019.12.041.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Switching to low-inflammatory diet linked to lower risk for Crohn’s disease
Among adults who consumed a proinflammatory diet, switching to a diet with lower inflammatory potential was associated with a significant subsequent decrease in risk for Crohn’s disease, according to a study of three longitudinal observational cohorts.
Researchers calculated empirical dietary inflammatory pattern (EDIP) scores based on food-frequency questionnaires completed by 166,903 women and 41,931 men who participated in the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-up Study. Questionnaires were completed at two time points, separated by 8 years. Overall, adults whose cumulative average EDIP scores fell within the highest quartile – meaning their diets had the highest inflammatory potential – were at 51% greater risk for developing Crohn’s disease than were adults whose diets fell within the lowest EDIP quartile (hazard ratio, 1.51; 95% confidence interval, 1.19-2.07; P = .01).
Strikingly, however, adults who initially consumed a proinflammatory diet (which is high in calories, red meat, high-fat dairy, and refined grains) and then switched to a low-inflammatory diet (one based around fruit, vegetables, legumes, whole grains, fish, and lean protein) had a statistically similar risk for Crohn’s disease as adults who consumed a low-inflammatory diet at both time points. The 95% confidence interval for the hazard ratio crossed 1.0 (HR, 1.51; 95% CI, 0.76-3.00). In contrast, adults who initially consumed a low-inflammatory diet and later changed to a proinflammatory diet were at twofold greater risk for Crohn’s disease than were those who remained on a low-inflammatory diet (HR, 2.05; 95% CI, 1.10-3.79).
These findings accounted for potential confounders, such as age, study cohort, time period, race, smoking, physical activity, and use of oral contraceptives and hormone replacement therapy, wrote Chun-Han Lo, MD, of the Harvard T.H. Chan School of Public Health, Boston, together with his associates in Gastroenterology.
The EDIP score is a weighted sum of 18 food groups that characterizes the potential for dietary inflammation based on circulating concentrations of inflammatory biomarkers. A proinflammatory diet may “trigger the onset of intestinal inflammation by inducing changes in [the] gut microbiome, altering host homeostasis, and regulating T-cell immune response,” the investigators noted.
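Mechanically, a score like the EDIP is a weighted sum over food-group intakes, with proinflammatory groups weighted positively and anti-inflammatory groups negatively. The food groups and weights in this sketch are hypothetical placeholders, not the published EDIP coefficients.

```python
# Mechanics of an EDIP-style score: a weighted sum of food-group intakes
# (servings/day). Positive weights mark groups associated with higher
# inflammatory biomarker levels, negative weights the reverse. These
# groups and weights are hypothetical, NOT the published EDIP values.

EDIP_WEIGHTS = {
    "processed meat": 0.7,
    "red meat": 0.5,
    "refined grains": 0.3,
    "sugary beverages": 0.4,
    "leafy greens": -0.6,
    "coffee": -0.3,
    "wine": -0.2,
}

def edip_score(servings_per_day):
    """Weighted sum over food groups; higher = more proinflammatory diet."""
    return sum(EDIP_WEIGHTS.get(group, 0.0) * amount
               for group, amount in servings_per_day.items())

proinflammatory_diet = {"processed meat": 1.5, "refined grains": 3.0, "sugary beverages": 2.0}
low_inflammatory_diet = {"leafy greens": 3.0, "coffee": 2.0, "red meat": 0.3}

print(edip_score(proinflammatory_diet), edip_score(low_inflammatory_diet))
```

In the study, participants' scores at each questionnaire cycle were averaged cumulatively and ranked into quartiles for the risk comparisons.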
In this study, which included nearly 5 million person-years of follow-up, 328 individuals were diagnosed with Crohn’s disease and 428 individuals developed ulcerative colitis. Median age at inflammatory bowel disease diagnosis was 55 years, with a range of 29-85 years. Notably, change in EDIP score was not linked to ulcerative colitis risk. Diet may be more relevant in Crohn’s disease than in ulcerative colitis, and dietary factors linked to ulcerative colitis were not associated with inflammatory markers in the cohorts and, thus, were not factored into the EDIP score, the researchers wrote.
The link between EDIP score and Crohn’s disease in this study did not change after controlling for fiber intake. Red wine (which contains anti-inflammatory resveratrol) might be a protective factor, the researchers hypothesized. They also found that pizza – a processed, calorie-dense food – was significantly inversely linked to inflammatory markers, perhaps because pizza contains abundant lycopene (from tomato paste) and fat (which facilitate lycopene absorption).
Prior studies on diet and inflammatory bowel disease assessed diets at only one time point and categorized dietary habits based on food groups, rather than linking foods with inflammatory markers, the researchers wrote. “Here, by identifying a combination of food groups predictive of circulating markers of inflammation, we provide a more robust evidence base behind the association of these foods with inflammation and inflammatory bowel disease.”
Most study participants were white health professionals. The researchers noted that “studies of dietary risk factors have not revealed a strong ethnicity-specific association, [but] extrapolating our findings to individuals of other ethnicity should be performed with caution.”
The National Institutes of Health, the Beker Foundation, the Chleck Family Foundation, and the Crohn’s and Colitis Foundation provided funding. Three coinvestigators disclosed ties to AbbVie, Bayer Pharma AG, Boehringer Ingelheim, Gilead, Janssen, Kyn Therapeutics, Merck, Pfizer, Policy Analysis Inc., and Takeda.
SOURCE: Lo C-H et al. Gastroenterology. 2020 May 1. doi: 10.1053/j.gastro.2020.05.011.
Diet is the single most modifiable risk factor influencing inflammatory bowel disease (IBD) development. Accordingly, animal studies show that specific nutrients and food additives impact gut barrier function and/or the microbiota, thereby influencing disease development. However, using these studies to provide humans with practical dietary advice has been difficult, in part because the effects of isolated food components can be quite different from those of complex foods. The complex nature of human foods has also stymied epidemiologic approaches to determine how diet influences IBD risk. This difficulty is exacerbated by the potential of IBD itself to influence diet, likely beginning long before disease diagnosis.
Lo and colleagues surmount these problems by developing the “empirical dietary inflammatory pattern” (EDIP), a metric that quantifies the proinflammatory potential of one’s overall diet based on the extent to which its components associate with proinflammatory cytokine levels in a large healthy human cohort. Retrospective application of this metric to three large human cohorts found that consumption of proinflammatory diets increased risk of developing Crohn’s disease but not ulcerative colitis. This indicates, perhaps not surprisingly, that a central means by which diet influences risk of Crohn’s is by promoting inflammation in susceptible hosts. Furthermore, while this approach implicated the usual suspects, such as low-fiber processed foods, in promoting Crohn’s, it also found a protective role for pizza, perhaps reflecting the anti-inflammatory action of its tomato-based components. It should soon be possible for persons with high genetic risk for Crohn’s to use the EDIP to discern how their diet is influencing that risk and, moreover, to design practical strategies to mitigate it.
Andrew T. Gewirtz, PhD, is a professor at Georgia State University’s Institute for Biomedical Sciences, Atlanta. He has no conflicts.
Among adults who consumed a proinflammatory diet, switching to a diet with lower inflammatory potential was associated with a significant subsequent decrease in risk for Crohn’s disease, according to a study of three longitudinal observational cohorts.
Researchers calculated empirical dietary inflammatory pattern (EDIP) scores based on food-frequency questionnaires completed by 166,903 women and 41,931 men who participated in the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-up Study. Questionnaires were completed at two time points, separated by 8 years. Overall, adults whose cumulative average EDIP scores fell within the highest quartile – meaning their diets had the highest inflammatory potential – were at 51% greater risk for developing Crohn’s disease than were adults whose diets fell within the lowest EDIP quartile (hazard ratio, 1.51; 95% confidence interval, 1.19-2.07; P = .01).
Strikingly, however, adults who initially consumed a proinflammatory diet (which is high in calories, red meat, high-fat dairy, and refined grains) and then switched to a low-inflammatory diet (one based around fruit, vegetables, legumes, whole grains, fish, and lean protein) had a statistically similar risk for Crohn’s disease as adults who consumed a low-inflammatory diet at both time points. The 95% confidence interval for the hazard ratio crossed 1.0 (HR, 1.51; 95% CI, 0.76-3.00). In contrast, adults who initially consumed a low-inflammatory diet and later changed to a proinflammatory diet were at twofold greater risk for Crohn’s disease than were those who remained on a low-inflammatory diet (HR, 2.05; 95% CI, 1.10-3.79).
These findings accounted for potential confounders, such as age, study cohort, time period, race, smoking, physical activity, and use of oral contraceptives and hormone replacement therapy, wrote Chun-Han Lo, MD, of the Harvard T.H. Chan School of Public Health, Boston, together with his associates in Gastroenterology.
The EDIP score is a weighted sum of 18 food groups that characterizes the potential for dietary inflammation based on circulating concentrations of inflammatory biomarkers. A proinflammatory diet may “trigger the onset of intestinal inflammation by inducing changes in [the] gut microbiome, altering host homeostasis, and regulating T-cell immune response,” the investigators noted.
In this study, which included nearly 5 million person-years of follow-up, 328 individuals were diagnosed with Crohn’s disease and 428 individuals developed ulcerative colitis. Median age at inflammatory bowel disease diagnosis was 55 years, with a range of 29-85 years. Notably, change in EDIP score was not linked to ulcerative colitis risk. Diet may be more relevant in Crohn’s disease than ulcerative colitis, and dietary factors linked to ulcerative colitis were not associated with inflammatory markers in the cohorts and, thus, were not factored into EDIP score, the researchers wrote.
The link between EDIP score and Crohn’s disease in this study did not change after controlling for fiber intake. Red wine (which contains anti-inflammatory resveratrol) might be a protective factor, the researchers hypothesized. They also found that pizza – a processed, calorie-dense food – was significantly inversely linked to inflammatory markers, perhaps because pizza contains abundant lycopene (from tomato paste) and fat (which facilitate lycopene absorption).
Prior studies on diet and inflammatory bowel disease assessed diets at only one time point and categorized dietary habits based on food groups, rather than linking foods with inflammatory markers, the researchers wrote. “Here, by identifying a combination of food groups predictive of circulating markers of inflammation, we provide a more robust evidence base behind the association of these foods with inflammation and inflammatory bowel disease.”
Most study participants were white health professionals. The researchers noted that “studies of dietary risk factors have not revealed a strong ethnicity-specific association, [but] extrapolating our findings to individuals of other ethnicity should be performed with caution.”
The National Institutes of Health, the Beker Foundation, the Chleck Family Foundation, and the Crohn’s and Colitis Foundation provided funding. Three coinvestigators disclosed ties to AbbVie, Bayer Pharma AG, Boehringer Ingelheim, Gilead, Janssen, Kyn Therapeutics, Merck Pfizer, Policy Analysis Inc., and Takeda.
SOURCE: Lo C-H et al. Gastroenterology. 2020 May 1. doi: 10.1053/j.gastro.2020.05.011.
Among adults who consumed a proinflammatory diet, switching to a diet with lower inflammatory potential was associated with a significant subsequent decrease in risk for Crohn’s disease, according to a study of three longitudinal observational cohorts.
Researchers calculated empirical dietary inflammatory pattern (EDIP) scores based on food-frequency questionnaires completed by 166,903 women and 41,931 men who participated in the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-up Study. Questionnaires were completed at two time points, separated by 8 years. Overall, adults whose cumulative average EDIP scores fell within the highest quartile – meaning their diets had the highest inflammatory potential – were at 51% greater risk for developing Crohn’s disease than were adults whose diets fell within the lowest EDIP quartile (hazard ratio, 1.51; 95% confidence interval, 1.19-2.07; P = .01).
Strikingly, however, adults who initially consumed a proinflammatory diet (which is high in calories, red meat, high-fat dairy, and refined grains) and then switched to a low-inflammatory diet (one based around fruit, vegetables, legumes, whole grains, fish, and lean protein) had a statistically similar risk for Crohn’s disease as adults who consumed a low-inflammatory diet at both time points. The 95% confidence interval for the hazard ratio crossed 1.0 (HR, 1.51; 95% CI, 0.76-3.00). In contrast, adults who initially consumed a low-inflammatory diet and later changed to a proinflammatory diet were at twofold greater risk for Crohn’s disease than were those who remained on a low-inflammatory diet (HR, 2.05; 95% CI, 1.10-3.79).
These findings accounted for potential confounders, such as age, study cohort, time period, race, smoking, physical activity, and use of oral contraceptives and hormone replacement therapy, wrote Chun-Han Lo, MD, of the Harvard T.H. Chan School of Public Health, Boston, together with his associates in Gastroenterology.
The EDIP score is a weighted sum of 18 food groups that characterizes the potential for dietary inflammation based on circulating concentrations of inflammatory biomarkers. A proinflammatory diet may “trigger the onset of intestinal inflammation by inducing changes in [the] gut microbiome, altering host homeostasis, and regulating T-cell immune response,” the investigators noted.
In this study, which included nearly 5 million person-years of follow-up, 328 individuals were diagnosed with Crohn’s disease and 428 individuals developed ulcerative colitis. Median age at inflammatory bowel disease diagnosis was 55 years, with a range of 29-85 years. Notably, change in EDIP score was not linked to ulcerative colitis risk. Diet may be more relevant in Crohn’s disease than ulcerative colitis, and dietary factors linked to ulcerative colitis were not associated with inflammatory markers in the cohorts and, thus, were not factored into EDIP score, the researchers wrote.
The link between EDIP score and Crohn’s disease in this study did not change after controlling for fiber intake. Red wine (which contains anti-inflammatory resveratrol) might be a protective factor, the researchers hypothesized. They also found that pizza – a processed, calorie-dense food – was significantly inversely linked to inflammatory markers, perhaps because pizza contains abundant lycopene (from tomato paste) and fat (which facilitate lycopene absorption).
Prior studies on diet and inflammatory bowel disease assessed diets at only one time point and categorized dietary habits based on food groups, rather than linking foods with inflammatory markers, the researchers wrote. “Here, by identifying a combination of food groups predictive of circulating markers of inflammation, we provide a more robust evidence base behind the association of these foods with inflammation and inflammatory bowel disease.”
Most study participants were white health professionals. The researchers noted that “studies of dietary risk factors have not revealed a strong ethnicity-specific association, [but] extrapolating our findings to individuals of other ethnicity should be performed with caution.”
The National Institutes of Health, the Beker Foundation, the Chleck Family Foundation, and the Crohn’s and Colitis Foundation provided funding. Three coinvestigators disclosed ties to AbbVie, Bayer Pharma AG, Boehringer Ingelheim, Gilead, Janssen, Kyn Therapeutics, Merck, Pfizer, Policy Analysis Inc., and Takeda.
SOURCE: Lo C-H et al. Gastroenterology. 2020 May 1. doi: 10.1053/j.gastro.2020.05.011.
FROM GASTROENTEROLOGY
AAP releases new policy statement on barrier protection for teens
For adolescent patients, routinely take a sexual history, discuss the use of barrier methods, and perform relevant examinations, screenings, and vaccinations, according to a new policy statement on barrier protection use from the American Academy of Pediatrics’ Committee on Adolescence.
The policy statement has been expanded to cover multiple types of sexual activity and methods of barrier protection. These include not only traditional condoms, but also internal condoms (available in the United States only by prescription) and dental dams or latex sheets (for use during oral sex). “Pediatricians and other clinicians are encouraged to provide barrier methods within their offices and support availability within their communities,” said Laura K. Grubb, MD, MPH, of Tufts Medical Center in Boston, who authored both the policy statement and the technical report.
Counsel adolescents that abstaining from sexual intercourse is the best way to prevent genital sexually transmitted infections (STIs), HIV infection, and unplanned pregnancy. Also encourage and support consistent, correct barrier method use – in addition to other reliable contraception, if patients are sexually active or are thinking about becoming sexually active – the policy statement notes. Emphasize that all partners share responsibility to prevent STIs and unplanned pregnancies. “Adolescents with intellectual and physical disabilities are an overlooked group when it comes to sexual behavior, but they have similar rates of sexual behaviors when compared with their peers without disabilities,” Dr. Grubb and colleagues emphasized in the policy statement.
This is key because Centers for Disease Control and Prevention 2017 data showed that in the United States, “456,000 adolescent and young women younger than 20 years became pregnant; 448,000 of those pregnancies were among 15- to 19-year-olds, and 7,400 were among those 14 years of age and younger,” according to the technical report accompanying the policy statement. Also, “new cases of STIs increased 31% in the United States from 2013 to 2017, with half of the 2.3 million new STIs reported each year among young people 15 to 24 years of age.”
Parents may need support and encouragement to talk with their children about sex, sexuality, and the use of barrier methods to prevent STIs. In the policy statement, Dr. Grubb and colleagues recommend: “Actively communicate to parents and communities that making barrier methods available to adolescents does not increase the onset or frequency of adolescent sexual activity, and that use of barrier methods can help decrease rates of unintended pregnancy and acquisition of STIs.”
Use Bright Futures: Guidelines for Health Supervision of Infants, Children, and Adolescents, Fourth Edition, for guidance on supporting parents and adolescents in promoting healthy sexual development and sexuality, including discussions of barrier methods.
Some groups of adolescents may use barrier methods less consistently because they perceive themselves to be at lower risk. These include adolescents who use preexposure prophylaxis or nonbarrier contraception, who identify as bisexual or lesbian, or who are in established relationships. Monitor these patients to assess their risk and need for additional counseling. The technical report cites studies finding that barrier methods are used less consistently during oral sex and that condom use is lower among cisgender and transgender females, and among adolescents who self-identify as gay, lesbian, or bisexual, compared with other groups.
In the policy statement, Dr. Grubb and colleagues call on pediatricians to advocate for more research and better access to barrier methods, especially for higher-risk adolescents and those living in underserved areas. In particular, school education programs on barrier methods can reach large adolescent groups and provide a “comprehensive array of educational and health care resources.”
Katie Brigham, MD, a pediatrician at MassGeneral Hospital for Children in Boston, affirmed the recommendations in the new policy statement (which she did not help write or research). “Even though the pregnancy rate is dropping in the United States, STI rates are increasing, so it is vital that pediatricians and other providers of adolescents and young adults counsel all their patients, regardless of gender and sexual orientation, of the importance of barrier methods when having oral, vaginal, or anal sex,” she said in an interview.
Dr. Brigham praised the technical report, adding that she found no major weaknesses in its methodology. “For future research, it would be interesting to see if there are different rates of pregnancy and STIs in pediatric practices that provide condoms and other barrier methods free to their patients, compared to those that do not.”
No external funding sources were reported. Dr. Grubb and Dr. Brigham reported having no relevant financial disclosures.
SOURCE: Grubb LK et al. Pediatrics. 2020 Jul 20. doi: 10.1542/peds.2020-007237.
FROM PEDIATRICS
Characterization of norovirus immunity in nonsecretor adults might provide vaccine model for children
Among nonsecretors – individuals who express a less diverse array of fucosylated histoblood group antigen carbohydrates (HBGAs) and consequently are less susceptible to some norovirus strains – natural infection with norovirus strain GII.2 induced cellular and antibody immunity that lasted for at least 30 days for T cells, monocytes, and dendritic cells and for at least 180 days for blocking antibodies, researchers reported.
“Multiple cellular lineages expressing interferon-gamma and tumor necrosis factor [TNF]–alpha dominated the response. Both T-cell and B-cell responses were cross-reactive with other GII strains, but not GI strains,” Lisa C. Lindesmith of the University of North Carolina, Chapel Hill, and her associates wrote in Cellular and Molecular Gastroenterology and Hepatology. The researchers also found that bile salts enable GII.2 to bind HBGAs produced by nonsecretors. “[I]n addition to HBGAs, one or more specific components of bile also is likely to be an essential co-factor for human norovirus attachment and infection,” the researchers wrote.
Susceptibility to norovirus depends on whether individuals express secretor enzyme, which is encoded by the FUT2 gene. Nonsecretors (who are FUT2–/–) express less varied HBGA, are susceptible to fewer norovirus strains, and are resistant to the predominant norovirus strain, GII.4. “Because future human norovirus vaccines will comprise GII.4 antigen, and because secretor phenotype impacts GII.4 infection and immunity, nonsecretors may mimic young children immunologically in response to GII.4 vaccination,” the researchers explained. But until now, most vaccines have focused on adult secretors, they said.
Their study focused on a familial norovirus outbreak in Chapel Hill that was the first to be characterized among nonsecretors who were naturally infected with norovirus GII.2. Four adults provided blood samples, and one provided a stool sample from which the researchers isolated and cloned the GII.2 capsid gene sequence. They used neutralization assays to study serologic immunity and flow cytometry to assess cellular activation and cytokine production in blood samples from the four cases and from seven healthy donors.
Norovirus GII.2 infection activated both innate and adaptive immunity and typical production of antiviral helper T cell (Th)1 and Th2 cytokines. The cellular immune response lasted at least 30 days, “long after symptom resolution,” the investigators wrote.
Compared with healthy donors, blood specimens from infected nonsecretors showed increases in non-class-switched memory B cells, transitional B cells, and plasmablasts, and both naive and memory B cells also were positive for activation markers for at least 30 days after infection. Activated interferon-gamma+ T cells, natural killer cells, TNF-alpha+ monocytes, IL-10+ and TNF-alpha+ myeloid dendritic cells, and TNF-alpha+ plasmacytoid dendritic cells also persisted for at least 30 days. Cross-reactive GII immunity was evident for at least 180 days. “GII.2 infection boosted cross-reactive blocking antibodies to GII.3, GII.14, and GII.17, as well as T-cell responses to GII.4, despite the lack of clear serologic evidence of previous GII.4 exposure,” the investigators wrote.
Based on prior reports that bile enhances norovirus growth or ligand binding, they inoculated specimens with chenodeoxycholic acid (CDCA) and glycochenodeoxycholic acid (GCDCA), pig bile, ox bile, or human bile. “Strikingly, the addition of bile enabled GII.2 Chapel Hill outbreak virus-like particle to bind to saliva from the four nonsecretor donors,” the researchers wrote. Bile acids “may override the genetic advantage of less-diverse HBGA expression in nonsecretors by improving the avidity of GII.2 binding to nonsecretor HBGAs, potentially paving the way for infection.” However, bile salts did not enable the GII.2 strain to replicate in human intestinal enteroid cells, which suggests that additional factors play into how norovirus enters human cells, according to the researchers.
The findings, they wrote, “support development of within-genogroup, cross-reactive antibody and T-cell immunity, key outcomes that may provide the foundation for eliciting broad immune responses after GII.4 vaccination in individuals with limited GII.4 immunity, including young children.”
The National Institutes of Health, the Wellcome Trust, the Centers for Disease Control and Prevention, and a Cancer Center Core Support grant provided funding. Ms. Lindesmith and her associates reported having no relevant conflicts of interest.
SOURCE: Lindesmith LC et al. Cell Mol Gastroenterol Hepatol. 2020;10:245-67.
Noroviruses belonging to genogroup II.4 are the leading cause of acute gastroenteritis, but our understanding of norovirus immunity remains incomplete. Most studies have focused on humoral responses and have shown that antibodies may be short lived, strain specific, and not always protective against rechallenge. On the other hand, human innate and T-cell immunity have received little attention despite evidence from the mouse norovirus model that they are critical for limiting viral spread and clearing antigen.
In this study, Lindesmith et al. conducted broad phenotypic and functional analysis of innate and adaptive immune responses following infection with a GII.2 strain of norovirus. Their cohort consists of “nonsecretors,” subjects who express a limited repertoire of histoblood group antigens and are therefore naturally resistant to GII.4 infection. Since nonsecretors have no pre-existing immunity against GII.4 viruses, this system enables the authors to test cross-reactivity of GII.2-specific T cells against GII.4 virus-like particles (VLPs).
The authors showed broad immune activation against natural norovirus infection. Following GII.2 infection, T-cell responses persist for at least a month and, importantly, are cross-reactive against GII.4 VLPs. These findings suggest that T cells may target conserved viral epitopes and play an important role in long-term protection against reinfection.
Developing an effective norovirus vaccine will require a detailed understanding of immune correlates of protection, and this study is a step in the right direction. In future work, tracking epitope-specific T cells should further define the phenotype, functionality, and localization of the norovirus T-cell repertoire.
Vesselin Tomov, MD, PhD, is assistant professor of medicine at the Hospital of the University of Pennsylvania, Philadelphia. He has no conflicts of interest.
Noroviruses belonging to genogroup II.4 are the leading cause of acute gastroenteritis, but our understanding of norovirus immunity remains incomplete. Most studies have focused on humoral responses and have shown that antibodies may be short lived, strain specific, and not always protective against rechallenge. On the other hand, human innate and T-cell immunity have received little attention despite evidence from the mouse norovirus model that they are critical for limiting viral spread and clearing antigen.
In this study, Lindesmith et al. conducted broad phenotypic and functional analysis of innate and adaptive immune responses following infection with a GII.2 strain of norovirus. Their cohort consists of “nonsecretors,” subjects who express a limited repertoire of histoblood group antigens and are therefore naturally resistant to GII.4 infection. Since nonsecretors have no pre-existing immunity against GII.4 viruses, this system enables the authors to test cross-reactivity of GII.2-specific T cells against GII.4 virus-like particles (VLPs).
The authors showed broad immune activation against natural norovirus infection. Following GII.2 infection, T-cell responses persist for at least a month and, importantly, are cross-reactive against GII.4 VLPs. These findings suggest that T cells may target conserved viral epitopes and play an important role in long-term protection against reinfection.
Developing an effective norovirus vaccine will require a detailed understanding of immune correlates of protection, and this study is a step in the right direction. In future work, tracking epitope-specific T cells must further define the phenotype, functionality, and localization of the norovirus T-cell repertoire.
Vesselin Tomov, MD, PhD, is assistant professor of medicine at the Hospital of the University of Pennsylvania, Philadelphia. He has no conflicts of interest.
Noroviruses belonging to genogroup II.4 are the leading cause of acute gastroenteritis, but our understanding of norovirus immunity remains incomplete. Most studies have focused on humoral responses and have shown that antibodies may be short lived, strain specific, and not always protective against rechallenge. On the other hand, human innate and T-cell immunity have received little attention despite evidence from the mouse norovirus model that they are critical for limiting viral spread and clearing antigen.
In this study, Lindesmith et al. conducted broad phenotypic and functional analysis of innate and adaptive immune responses following infection with a GII.2 strain of norovirus. Their cohort consists of “nonsecretors,” subjects who express a limited repertoire of histoblood group antigens and are therefore naturally resistant to GII.4 infection. Since nonsecretors have no pre-existing immunity against GII.4 viruses, this system enables the authors to test cross-reactivity of GII.2-specific T cells against GII.4 virus-like particles (VLPs).
The authors showed broad immune activation against natural norovirus infection. Following GII.2 infection, T-cell responses persist for at least a month and, importantly, are cross-reactive against GII.4 VLPs. These findings suggest that T cells may target conserved viral epitopes and play an important role in long-term protection against reinfection.
Developing an effective norovirus vaccine will require a detailed understanding of immune correlates of protection, and this study is a step in the right direction. In future work, tracking epitope-specific T cells must further define the phenotype, functionality, and localization of the norovirus T-cell repertoire.
Vesselin Tomov, MD, PhD, is assistant professor of medicine at the Hospital of the University of Pennsylvania, Philadelphia. He has no conflicts of interest.
Among nonsecretors – individuals who express a less diverse array of fucosylated histoblood group antigen carbohydrates (HBGAs) and consequently are less susceptible to some norovirus strains – natural infection with norovirus strain GII.2 induced cellular and antibody immunity that lasted for at least 30 days for T cells, monocytes, and dendritic cells and for at least 180 days for blocking antibodies, researchers reported.
“Multiple cellular lineages expressing interferon-gamma and tumor necrosis factor [TNF]–alpha dominated the response. Both T-cell and B-cell responses were cross-reactive with other GII strains, but not GI strains,” Lisa C. Lindesmith of the University of North Carolina, Chapel Hill, and her associates wrote in Cellular and Molecular Gastroenterology and Hepatology. The researchers also found that bile salts enable GII.2 to bind HBGAs produced by nonsecretors. “[I]n addition to HBGAs, one or more specific components of bile also is likely to be an essential co-factor for human norovirus attachment and infection,” the researchers wrote.
Susceptibility to norovirus depends on whether individuals express secretor enzyme, which is encoded by the FUT2 gene. Nonsecretors (who are FUT2–/–) express less varied HBGA, are susceptible to fewer norovirus strains, and are resistant to the predominant norovirus strain, GII.4. “Because future human norovirus vaccines will comprise GII.4 antigen, and because secretor phenotype impacts GII.4 infection and immunity, nonsecretors may mimic young children immunologically in response to GII.4 vaccination,” the researchers explained. But until now, most vaccines have focused on adult secretors, they said.
Their study focused on a familial norovirus outbreak in Chapel Hill that was the first to be characterized among nonsecretors who were naturally infected with norovirus GII.2. Four adults provided blood samples, and one provided a stool sample from which the researchers isolated and cloned the G11.2 capsid gene sequence. They used neutralization assays to study serologic immunity and flow cytometry to assess cellular activation and cytokine production in blood samples from the four cases and from seven healthy donors.
Norovirus GII.2 infection activated both innate and adaptive immunity and typical production of antiviral helper T cell (Th)1 and Th2 cytokines. The cellular immune response lasted at least 30 days, “long after symptom resolution,” the investigators wrote.
Compared with healthy donors, blood specimens from infected nonsecretors showed increases in non-class-switched memory, transitional B cells, and plasmablast B cells, and both naive and memory B cells also were positive for activation markers for at least 30 days after infection. Activated interferon-gamma+ T cells, natural killer cells, TNF-alpha+ monocytes, IL-10+, TNF-alpha+ myeloid dendritic cells, and TNF plasmacytoid dendritic cells also persisted for at least 30 days. Cross-reactive GII immunity was evident for at least 180 days. “GII.2 infection boosted cross-reactive blocking antibodies to GII.3, GII.14, and GII.17, as well as T-cell responses to GII.4, despite the lack of clear serologic evidence of previous GII.4 exposure,” the investigators wrote.
Based on prior reports that bile enhances norovirus growth or ligand binding, they inoculated specimens with chenodeoxycholic acid (CDCA) and glycochenodeoxycholic acid (GCDCA), pig bile, ox bile, or human bile. “Strikingly, the addition of bile enabled GII.2 Chapel Hill outbreak virus-like particle to bind to saliva from the four nonsecretor donors,” the researchers wrote. Bile acids “may override the genetic advantage of less-diverse HBGA expression in nonsecretors by improving the avidity of GII.2 binding to nonsecretor HBGAs, potentially paving the way for infection.” However, bile salts did not enable the GII.2 strain to replicate in human intestinal enteroid cells, which suggests that additional factors play into how norovirus enters human cells, according to the researchers.
The findings, they wrote, “support development of within-genogroup, cross-reactive antibody and T-cell immunity, key outcomes that may provide the foundation for eliciting broad immune responses after GII.4 vaccination in individuals with limited GII.4 immunity, including young children.”
The National Institutes of Health, the Wellcome Trust, the Centers for Disease Control and Prevention, and a Cancer Center Core support provided funding. Ms. Lindesmith and her associates reported having no relevant conflicts of interest.
SOURCE: Lindesmith LC et al. Cell Molec Gastroenterol Hepatol. 2020;10:245-67.
Among nonsecretors – individuals who express a less diverse array of fucosylated histoblood group antigen carbohydrates (HBGAs) and consequently are less susceptible to some norovirus strains – natural infection with norovirus strain GII.2 induced cellular and antibody immunity that lasted for at least 30 days for T cells, monocytes, and dendritic cells and for at least 180 days for blocking antibodies, researchers reported.
“Multiple cellular lineages expressing interferon-gamma and tumor necrosis factor [TNF]–alpha dominated the response. Both T-cell and B-cell responses were cross-reactive with other GII strains, but not GI strains,” Lisa C. Lindesmith of the University of North Carolina, Chapel Hill, and her associates wrote in Cellular and Molecular Gastroenterology and Hepatology. The researchers also found that bile salts enable GII.2 to bind HBGAs produced by nonsecretors. “[I]n addition to HBGAs, one or more specific components of bile also is likely to be an essential co-factor for human norovirus attachment and infection,” the researchers wrote.
Susceptibility to norovirus depends on whether individuals express secretor enzyme, which is encoded by the FUT2 gene. Nonsecretors (who are FUT2–/–) express less varied HBGA, are susceptible to fewer norovirus strains, and are resistant to the predominant norovirus strain, GII.4. “Because future human norovirus vaccines will comprise GII.4 antigen, and because secretor phenotype impacts GII.4 infection and immunity, nonsecretors may mimic young children immunologically in response to GII.4 vaccination,” the researchers explained. But until now, most vaccines have focused on adult secretors, they said.
Their study focused on a familial norovirus outbreak in Chapel Hill that was the first to be characterized among nonsecretors who were naturally infected with norovirus GII.2. Four adults provided blood samples, and one provided a stool sample from which the researchers isolated and cloned the GII.2 capsid gene sequence. They used neutralization assays to study serologic immunity and flow cytometry to assess cellular activation and cytokine production in blood samples from the four cases and from seven healthy donors.
Norovirus GII.2 infection activated both innate and adaptive immunity and typical production of antiviral helper T cell (Th)1 and Th2 cytokines. The cellular immune response lasted at least 30 days, “long after symptom resolution,” the investigators wrote.
Compared with healthy donors, blood specimens from infected nonsecretors showed increases in non-class-switched memory B cells, transitional B cells, and plasmablasts, and both naive and memory B cells also were positive for activation markers for at least 30 days after infection. Activated interferon-gamma+ T cells, natural killer cells, TNF-alpha+ monocytes, IL-10+ and TNF-alpha+ myeloid dendritic cells, and TNF-alpha+ plasmacytoid dendritic cells also persisted for at least 30 days. Cross-reactive GII immunity was evident for at least 180 days. “GII.2 infection boosted cross-reactive blocking antibodies to GII.3, GII.14, and GII.17, as well as T-cell responses to GII.4, despite the lack of clear serologic evidence of previous GII.4 exposure,” the investigators wrote.
Based on prior reports that bile enhances norovirus growth or ligand binding, they inoculated specimens with chenodeoxycholic acid (CDCA) and glycochenodeoxycholic acid (GCDCA), pig bile, ox bile, or human bile. “Strikingly, the addition of bile enabled GII.2 Chapel Hill outbreak virus-like particle to bind to saliva from the four nonsecretor donors,” the researchers wrote. Bile acids “may override the genetic advantage of less-diverse HBGA expression in nonsecretors by improving the avidity of GII.2 binding to nonsecretor HBGAs, potentially paving the way for infection.” However, bile salts did not enable the GII.2 strain to replicate in human intestinal enteroid cells, which suggests that additional factors play into how norovirus enters human cells, according to the researchers.
The findings, they wrote, “support development of within-genogroup, cross-reactive antibody and T-cell immunity, key outcomes that may provide the foundation for eliciting broad immune responses after GII.4 vaccination in individuals with limited GII.4 immunity, including young children.”
The National Institutes of Health, the Wellcome Trust, the Centers for Disease Control and Prevention, and a Cancer Center Core Support grant provided funding. Ms. Lindesmith and her associates reported having no relevant conflicts of interest.
SOURCE: Lindesmith LC et al. Cell Molec Gastroenterol Hepatol. 2020;10:245-67.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Real-time, computer-aided detection system significantly improved adenoma detection
A real-time, computer-aided detection system using artificial intelligence significantly improved adenoma detection during high-definition colonoscopy in a multicenter, randomized clinical trial.
The adenoma detection rate was 55% in the intervention group and 40% in the control group, Alessandro Repici, MD, PhD, and his associates wrote in Gastroenterology. Improved detection of smaller adenomas explained the difference. After age, sex, and indication for colonoscopy were controlled for, computer-aided detection (CADe) increased the probability of adenoma detection by 30% (risk ratio, 1.30; 95% confidence interval, 1.14-1.45).
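The relationship between the two detection rates and the published risk ratio can be sanity-checked directly: the crude (unadjusted) ratio of 55% to 40% is about 1.38, and the small gap from the reported 1.30 reflects the adjustment for age, sex, and indication. A minimal sketch using only the rates quoted in the article:

```python
# Crude (unadjusted) risk ratio from the reported adenoma detection rates.
# The published RR of 1.30 is adjusted for age, sex, and colonoscopy
# indication, so it differs slightly from this back-of-the-envelope figure.
adr_cade = 0.55     # adenoma detection rate, CADe arm
adr_control = 0.40  # adenoma detection rate, control arm

crude_rr = adr_cade / adr_control
print(round(crude_rr, 3))  # 1.375, vs. the adjusted RR of 1.30
```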
The CADe system did not increase the likelihood of resecting non-neoplastic lesions (26% versus 29% in the control group), said Dr. Repici, of Humanitas Research Hospital in Milano, Italy. “The per-protocol analysis produced similar results,” he and his associates wrote. “The substantial improvement for adenoma detection rate and mean number of adenomas per colonoscopy, without increasing the removal of nonneoplastic lesions, is likely to improve the quality of colonoscopy without affecting its efficiency.”
Screening colonoscopies miss about 25% of adenomas, increasing patients’ risk for colorectal cancer. Although real-time CADe systems can identify colorectal neoplasias, comprehensive studies of the effect of CADe systems on adenoma detection and other colonoscopy quality measures are lacking.
The study included 685 adults from three centers in Italy who underwent screening colonoscopies for colorectal cancer, postpolypectomy surveillance, or workup based on a positive fecal immunochemical test or signs and symptoms of colorectal cancer. Patients were randomly assigned on a one-to-one basis to receive high-definition colonoscopies with or without the CADe system, which consists of an artificial intelligence–based medical device (GI Genius, Medtronic) that processes colonoscopy images in real time and superimposes a green box over suspected lesions. Six experienced endoscopists performed the colonoscopies; the minimum withdrawal time was 6 minutes, and histopathology was the reference standard.
The average number of adenomas detected per colonoscopy was 1.1 (standard deviation, 0.5) in the CADe group and 0.7 (SD, 1.2) in the control group, for an incidence rate ratio of 1.46 (95% CI, 1.15-1.86). The CADe system also significantly improved the detection of adenomas measuring 5 mm or less (34% vs. 27% in the control group; RR, 1.26; 95% CI, 1.01-1.52) and adenomas measuring 6-9 mm (11% vs. 6%, respectively; RR, 1.78; 95% CI, 1.09-2.86). Detection of larger adenomas did not significantly differ between groups. These findings did not vary based on adenoma morphology (polypoid or nonpolypoid) or location (proximal or distal colon), the researchers said.
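The crude rate ratio implied by these per-colonoscopy means can likewise be computed directly; it lands near the reported incidence rate ratio of 1.46, which is the adjusted estimate. A minimal sketch using only values quoted in the article:

```python
# Crude rate ratio for mean adenomas detected per colonoscopy.
# The article's IRR of 1.46 (95% CI, 1.15-1.86) is a modeled estimate,
# so the raw ratio of means is only an approximation of it.
mean_cade = 1.1     # mean adenomas per colonoscopy, CADe arm
mean_control = 0.7  # mean adenomas per colonoscopy, control arm

crude_irr = mean_cade / mean_control
print(round(crude_irr, 2))  # 1.57, in the vicinity of the reported 1.46
```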
Detection of multiple adenomas also was higher in the intervention group than in the control group (23% vs. 15%, respectively; RR, 1.50; 95% CI, 1.19-1.95). There were no significant differences in the detection of sessile serrated lesions (7% and 5%) and nonneoplastic lesions (20% and 17%). Average withdrawal times did not significantly differ between groups (417 seconds for CADe and 435 seconds for the control group).
The CADe system is a convolutional neural network that was trained and validated using a series of more than 2,600 histologically confirmed polyps from 840 participants in a prior clinical trial (Gastroenterology 2019;156:2198-207.e1). The system takes an average of 1.5 microseconds to output processed images.
“The addition of real-time CADe to colonoscopy resulted in a 30% and 46% relative increase in adenoma detection rate and the average number of adenomas detected per colonoscopy, demonstrating its efficacy in improving the detection of colorectal neoplasia at screening and diagnostic colonoscopy,” the investigators wrote. “[The s]afety of CADe was demonstrated by the lack of increase of both useless resections and withdrawal time, as well as by the exclusion of any underskilling in the study period.”
Medtronic loaned the equipment for the study. Dr. Repici and the senior author disclosed consulting fees from Medtronic.
SOURCE: Repici A et al. Gastroenterology. 2020 May 3. doi: 10.1053/j.gastro.2020.04.062.
FROM GASTROENTEROLOGY
High-def chromoendoscopy better for detecting dysplasias in patients with IBD
High-definition chromoendoscopy significantly outperformed high-definition white-light endoscopy for detecting dysplastic lesions in patients with inflammatory bowel disease, according to the findings of a single-center prospective randomized trial.
In the intention-to-diagnose analysis, rates of dysplasia detection were 11% for high-definition chromoendoscopy and 5% for high-definition white-light endoscopy (P = .032). The per-protocol analysis produced a similar result (12% vs. 5%, respectively; P = .027). High-definition chromoendoscopy also detected significantly more dysplastic lesions per 10 minutes of colonoscope withdrawal time in the per-protocol analysis, although the difference did not reach statistical significance in the intention-to-diagnose analysis.
Overall, the findings “support the use of chromoendoscopy for surveillance of patients with inflammatory bowel diseases,” Bjarki Alexandersson, a PhD student at Karolinska University Hospital in Solna, Stockholm, and his associates wrote in Clinical Gastroenterology and Hepatology. Patients with inflammatory bowel disease are at increased risk for colorectal cancer. Most guidelines support chromoendoscopy for the surveillance of these patients, as do the results of two recent meta-analyses in which chromoendoscopy detected significantly more dysplasias among patients with inflammatory bowel disease than did white light endoscopy. However, in subgroup analyses of these studies, the difference only emerged when comparing chromoendoscopy with standard (not high-definition) white-light endoscopy. “Thus, the evidence in support of chromoendoscopy using high-definition endoscopes is weak,” the researchers wrote.
For the study, they prospectively enrolled 305 patients with ulcerative colitis or Crohn’s disease who were referred for surveillance colonoscopy at an academic hospital in Sweden from March 2011 through April 2016. Participants were randomly assigned to receive either high-definition chromoendoscopy with indigo carmine (152 patients) or high-definition white-light endoscopy (153 patients).
In the intention-to-diagnose analysis, dysplasias were detected in 17 (11%) patients evaluated by high-definition chromoendoscopy, compared with 7 (5%) patients evaluated by high-definition white-light endoscopy (P = .032). After excluding 20 patients for inadequate bowel preparation, 18 patients for protocol violations, and 4 patients for incomplete colonoscopies, the per-protocol population consisted of 263 patients. Dysplasias were detected in 12% of patients evaluated by high-definition chromoendoscopy and in 5% of those evaluated by high-definition white-light endoscopy (P = .027).
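A P value in this range can be reproduced approximately from the intention-to-diagnose counts (17 of 152 vs. 7 of 153) with a standard pooled two-proportion z-test. This is a sketch under that assumption; the authors' exact test may differ slightly:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p from the standard normal CDF, built from math.erf.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Intention-to-diagnose counts from the article:
z, p = two_proportion_z(17, 152, 7, 153)
print(round(z, 2), round(p, 3))  # p comes out close to the reported .032
```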
All patients also had 32 samples collected by random biopsy, which used to be standard for detecting dysplasia in inflammatory bowel disease but has become more controversial in the era of video endoscopy, the researchers noted. In all, random biopsy evaluation identified dysplasias in nine patients, including six in the high-definition chromoendoscopy group and three in the high-definition white-light endoscopy group. Random biopsies were low-yield, identifying dysplasias in 0.092% of all specimens and 3% of colonoscopies. However, 20% of patients with dysplasias were identified only through random biopsy. This finding resembles that of another recent randomized trial in which 13% of patients with inflammatory bowel disease had dysplasias detected only through random biopsy (Gut. 2018;67:616-24), the researchers noted.
They also evaluated the number of macroscopic dysplastic lesions identified for every 10 minutes of colonoscope withdrawal time. These numbers were not significantly different in the intention-to-diagnose analysis (0.066 lesions in the high-definition chromoendoscopy group vs. 0.027 lesions in the high-definition white-light endoscopy group; P = .056). However, the per-protocol analysis revealed a significant difference (0.073 vs. 0.029 dysplastic lesions, respectively; P = .031).
“Based on our findings, we recommend the use of high-definition chromoendoscopy in inflammatory bowel disease surveillance,” the researchers concluded. They acknowledged several limitations: the study included patients from only one center; most dysplastic lesions were small and thus had an unclear natural history; and the endoscopists included both experts and nonexperts.
The Stockholm City Council provided funding. The researchers reported having no conflicts of interest.
SOURCE: Alexandersson B et al. Clin Gastroenterol Hepatol. 2020 Apr 27. doi: 10.1016/j.cgh.2020.04.049.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Biopsies of ascending, descending colon alone detected microscopic colitis
All patients with microscopic colitis who had biopsies of both the ascending and descending colon had positive slide review for at least one of the two sites, according to the findings of a single-center retrospective study.
“Microscopic colitis can be detected with 100% sensitivity by analyzing biopsy specimens from the ascending and descending colon. We propose a Western protocol (taking two biopsy specimens each from the ascending colon and the descending colon) in the evaluation of patients for microscopic colitis,” wrote Boris Virine, MD, of London (Ont.) Health Sciences Centre, Western University, together with his associates in Clinical Gastroenterology and Hepatology.
That is half the minimum number of samples recommended by current guidelines, the researchers noted. “The American Society for Gastrointestinal Endoscopy recommends two or more biopsy specimens from the right, transverse, left, and sigmoid colons; however, these recommendations were based on expert opinion rather than scientific evidence, and these guidelines have not been validated,” they wrote.
Microscopic colitis includes lymphocytic and collagenous subtypes, neither of which is grossly apparent on colonoscopy. “Endoscopists therefore often collect multiple random colonic biopsies, potentially oversampling, increasing times of colonoscopy and slide review,” Dr. Virine and his associates wrote.
To better pinpoint optimal biopsy sites and specimen numbers, they studied 101 patients consecutively diagnosed with biopsy-confirmed microscopic colitis at London Health Sciences Centre from 2017 through 2018. Patients with other colonic diseases were excluded from the study. Dr. Virine assessed all individual biopsy fragments, and another pathologist performed a second review of complex cases.
A total of 52 patients had biopsy-confirmed collagenous colitis – that is, normal crypt architecture, increased mononuclear inflammatory cells in the lamina propria, and a thickened subepithelial collagen band. Forty-two patients had lymphocytic colitis, defined as normal crypt architecture, increased mononuclear inflammatory cells in the lamina propria, and increased intraepithelial lymphocytosis. Seven patients had both disease subtypes.
For each patient, an average of nine (standard deviation, 4.9) biopsies had been collected. The most commonly sampled site was the ascending colon (biopsied in 47% of patients in whom at least one sample was labeled by site), followed by the descending colon (40%), rectum (21%), transverse colon (20%), sigmoid colon (15%), cecum (8%), and splenic and hepatic flexures (2% each). Diagnostic sensitivity was highest for the ascending colon (97%), transverse colon (96%), and sigmoid colon (91%) and lowest for the splenic flexure (75%), hepatic flexure (78%), and rectum (82%). The diagnostic sensitivity of the descending colon was 85%. However, all 39 patients with biopsies of both the ascending and descending colon had at least one biopsy that was positive for microscopic colitis (sensitivity, 100%).
“Based on the results of our study, collecting biopsy specimens from both the ascending and descending colons has the same overall sensitivity as following the guidelines,” the researchers concluded. “Because no single site in the colon was diffusely positive for microscopic colitis in 100% of cases, the possibility remains that collecting biopsy specimens from two sites could offer comparable sensitivity with biopsy specimens from each segment of the colon.”
No funding sources were reported. Dr. Virine and the senior author reported having no conflicts of interest. One coauthor disclosed ties to AbbVie, Allergan, Ferring, Janssen, Lupin Pendopharm, Pfizer, Shire, and Takeda.
SOURCE: Virine B et al. Clin Gastroenterol Hepatol. 2020 Feb 25. doi: 10.1016/j.cgh.2020.02.036.
Microscopic colitis is a common cause of watery diarrhea. This debilitating disease is easy to treat, but the diagnosis can be challenging. Without lesions to target, guidelines recommend colonoscopy with at least two biopsies from the right, transverse, descending, and sigmoid colon (total: eight-plus biopsies). With little evidence to guide this recommendation, this time-consuming protocol was proposed to minimize the risk of false-negative results.
This study by Virine and colleagues again demonstrates that microscopic colitis is a patchy colonic disease and that biopsy yield varies by anatomic location. Most importantly, the authors determined that a colonoscopy with two biopsies from the ascending and two biopsies from the descending colon (total: four biopsies) detects all patients with microscopic colitis. Biopsies of the rectosigmoid alone were insufficient. This work suggests that we can rule out a diagnosis of microscopic colitis by taking at least 50% fewer biopsies.
A more efficient and less invasive procedure is better for patients as sedation time and sampling the colon are associated with risks. In the future, a prospective, colonoscopy-based study in patients with diarrhea will allow us to confirm the optimal number and location of biopsies needed to establish a diagnosis of microscopic colitis. This work will be important to inform diagnostic guidelines and change practice.
Anne F. Peery, MD, MSCR, is assistant professor of medicine, division of gastroenterology and hepatology, University of North Carolina School of Medicine, Chapel Hill. She has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Combination probiotic formulations might improve outcomes in preterm infants
For preterm, low-birth-weight infants, probiotic formulations containing Lactobacillus and Bifidobacterium strains appear to be superior to single-strain probiotics and to other multiple-strain formulations for reducing the risk of all-cause mortality, according to the findings of a network meta-analysis of randomized clinical trials.
The results of a prior Cochrane review indicated that probiotics can help prevent severe necrotizing enterocolitis and all-cause mortality in preterm infants, but the most effective formulations remained unclear. Therefore, Rebecca L. Morgan, PhD, MPH, and her associates searched MEDLINE, EMBASE, Science Citation Index Expanded, CINAHL, Scopus, Cochrane CENTRAL, BIOSIS Previews, and Google Scholar through Jan. 1, 2019, to identify studies of single-strain and multistrain probiotic formulations in preterm, low-birth-weight neonates. A total of 63 studies involving 15,712 infants met inclusion criteria. “We used a frequentist approach for network meta-analysis and [a] GRADE approach to assess certainty of evidence,” they noted.
“High-certainty” evidence indicated that combination therapy with one or more Lactobacillus species and one or more Bifidobacterium species significantly reduced all-cause mortality, compared with placebo (odds ratio, 0.56; 95% confidence interval, 0.39-0.80), wrote Dr. Morgan, of McMaster University, Hamilton, Canada, and her coinvestigators. This was the only intervention to have moderate- or high-quality evidence for a reduction in mortality, the researchers wrote in Gastroenterology.
They added that, among the probiotic formulations with moderate- or high-quality evidence for efficacy, compared with placebo, those containing at least one species of Lactobacillus and at least one species of Bifidobacterium, and the single-strain probiotics containing Bifidobacterium animalis subspecies lactis, Lactobacillus reuteri, or Lactobacillus rhamnosus significantly reduced the risk of severe necrotizing enterocolitis (Bell stage II or higher), with statistically significant odds ratios of 0.35, 0.31, 0.55, and 0.44, respectively.
Three formulations were associated with “low-” or “very low-certainty” evidence for a reduction in risk for severe necrotizing enterocolitis, compared with placebo: Bacillus plus Enterococcus species; Lactobacillus plus Bifidobacterium plus Enterococcus species; and Bifidobacterium plus Streptococcus salivarius subspecies thermophilus. Estimated odds ratios were 0.23 (risk difference, –4.9%), 0.28 (RD, –4.9%), and 0.38 (RD, –3.9%), respectively.
“The combinations of Bacillus species and Enterococcus species, and one or more Bifidobacterium species and S. salivarius subspecies thermophilus, might produce the largest reduction in necrotizing enterocolitis development,” the investigators wrote. “Further trials are needed.”
Compared with placebo, no probiotic formulation significantly improved the third primary outcome in the meta-analysis, culture-confirmed sepsis. However, several formulations were associated with moderate- or high-quality evidence for efficacy on secondary outcome measures. Compared with placebo, combinations of Lactobacillus and Bifidobacterium plus Saccharomyces boulardii were associated with a significant decrease in the number of days to reach full feeding (mean reduction, 3.3 days; 95% CI, 0.7-5.9 days). Compared with placebo, single-strain therapy with B. animalis subspecies lactis or Lactobacillus reuteri was associated with a shorter duration of hospitalization, with mean reductions of 13.0 days (95% CI, 3.3-22.7 days) and 7.9 days (95% CI, 4.2-11.6 days), respectively.
“Multicenter and large randomized controlled trials should be prioritized to distinguish between the efficacy of single- and multiple-strain probiotics among preterm infants,” Dr. Morgan and her associates concluded. Such studies would further clarify the safety of probiotic formulations in this “fragile population,” they wrote. “Although the primary concern of live microbe administration, intestinal barrier translocation leading to sepsis, is decreased by several probiotic formulations, sound clinical judgement should be exercised.”
Partial support was provided by Mitacs Canada, in partnership with Nestlé Canada. The funder was not involved in designing or conducting the study or writing the manuscript. Dr. Morgan reported having no relevant conflicts of interest. One coinvestigator disclosed ties to AbbVie, Ferring, Janssen, and Takeda.
SOURCE: Morgan RL et al. Gastroenterology. 2020 Jun 24. doi: 10.1053/j.gastro.2020.05.096.
The demonstration of decreased risks of both death and necrotizing enterocolitis (NEC) in randomized placebo-controlled trials of probiotic microbes in very preterm babies is the most compelling case for administration of probiotics to date. Questions remain, including the optimal probiotic microbe(s) and dose for this population. The ideal studies would compare commercially available probiotic products and doses to each other (rather than to placebo). In the absence of these ideal studies, a network meta-analysis is a valuable tool to compare and rank multiple treatments. One of the drawbacks of a network meta-analysis is the assumption that all interventions have similar effects in all populations (an assumption that is challenging given the marked differences in the incidence of NEC between hospitals and populations).
The study conclusion that the combination of at least one Lactobacillus strain and at least one Bifidobacterium strain is most effective in preventing both death and NEC in very preterm infants is consistent with a previous network meta-analysis and with recent recommendations of the European Society for Paediatric Gastroenterology Hepatology and Nutrition and the American Gastroenterological Association.
Administration of probiotics to very preterm infants remains uncommon in many countries, including the United States. Parents of infants with NEC commonly express frustration at the lack of information about this disease and available preventive strategies. Given an intervention with limited evidence of harm and significant evidence of benefit, it is incumbent upon neonatologists to discuss the available evidence with parents and include their wishes in the decision-making process.
Mark A. Underwood, MD, MAS, is a professor of pediatrics and chief of the division of neonatology in the department of pediatrics at the University of California, Davis. He has received honoraria from Abbott and conducted a clinical trial of probiotics that was funded by Evolve Biosystems.
FROM GASTROENTEROLOGY