Postcolonoscopy colorectal cancers had unique features

Postcolonoscopy colorectal cancers were more likely to arise in the proximal colon and to show microsatellite instability, according to the results of a retrospective population-based study of 168 adults with incident colorectal cancers.

In all, 64% of postcolonoscopy colorectal cancers were located in the proximal colon, compared with 44% of detected colorectal cancers (P = .016), reported Niloy Jewel Samadder, MD, of the University of Utah in Salt Lake City, together with his associates. Furthermore, microsatellite instability (MSI) was detected in 32% of postcolonoscopy colorectal cancers, versus 13% of detected colorectal cancers (P = .005). These findings may point to differences in the underlying biology of postcolonoscopy colorectal cancers and detected colorectal cancers, they said. Studies are needed “to determine if postcolonoscopy cancers arise through a specific genetic pathway that may accelerate neoplastic progression,” they wrote in Clinical Gastroenterology and Hepatology.

Postcolonoscopy colorectal cancers are a “small but clinically important subset of colorectal cancers” that are diagnosed after the patient has a colonoscopy in which no cancer is detected, the researchers noted. These cancers have an estimated global prevalence ranging from 3% to 9% and an estimated pooled prevalence of 3.7% (Am J Gastroenterol. 2014;109:1375-89). Risk factors for postcolonoscopy colorectal cancers include low adenoma detection rates, care at rural facilities, and care by physicians who are not gastroenterologists. However, tumor-specific and patient-specific factors, including location within the colon and superior survival compared with detected cancers, raise the possibility of underlying molecular differences related to tumorigenesis, the researchers said.

To investigate this idea, they retrospectively analyzed data from residents of Utah between 50 and 80 years old who had a colonoscopy between Feb. 15, 1995, and Jan. 31, 2009, at one of two large clinical facilities in Utah (Intermountain Healthcare or the University of Utah Health Sciences). Using a state population-based database, they merged medical information from these patients with cancer histories from the Utah Cancer Registry. This enabled them to compare all 84 postcolonoscopy colorectal cancers with tissue available for analysis (defined as cancers detected within 6-60 months of a colonoscopy) with 84 detected colorectal cancers (those detected within 6 months of a colonoscopy).

In the multivariable analysis, MSI was the only molecular feature that was significantly more frequent in postcolonoscopy versus detected colorectal cancers (odds ratio, 4.20; 95% confidence interval, 1.58-11.14). However, postcolonoscopy colorectal cancers were significantly more likely to be early stage (86% versus 69% for detected colorectal cancers; P = .040). Five-year survival did not significantly differ between the groups.

“The molecular signatures of postcolonoscopy colorectal cancers in our study overlap with those of sporadic MSI and serrated pathways, suggesting these mechanisms play a disproportionate role in postcolonoscopy colorectal cancers,” the researchers said. “Additional studies are needed to determine whether these postcolonoscopy colorectal cancers arise through a familial cancer pathway and/or serrated neoplastic pathway of sporadic lesions.”

Funders included the American College of Gastroenterology, the National Cancer Institute, the Huntsman Cancer Foundation, the University of Utah, and the Utah Department of Health. Dr. Samadder reported consulting relationships with Cancer Prevention Pharmaceuticals and Janssen Research and Development. The other researchers reported having no conflicts of interest.

SOURCE: Samadder NJ et al. Clin Gastroenterol Hepatol. 2019 Mar 28. doi: 10.1016/j.cgh.2019.02.040.

Level of hepatitis B core–related antigen is risk factor for hepatocellular carcinoma

A high level of hepatitis B core–related antigen (HBcrAg) was a complementary risk factor for hepatocellular carcinoma, according to the results of a retrospective cohort study of more than 2,600 noncirrhotic adults with untreated hepatitis B virus (HBV) infection who were followed for a median of 16 years.

“Patients with an intermediate viral load and high levels of HBcrAg had a risk for hepatocellular carcinoma that did not differ significantly from that of patients with a high viral load. [An] HBcrAg of 10 KU/mL may serve as a novel biomarker for the management of patients with intermediate viral load in our clinical practice,” wrote Tai-Chung Tseng, MD, PhD, of National Taiwan University Hospital in Taipei and associates in Gastroenterology.

Deciding whether to start antiviral therapy is controversial for some patients with HBV infection. Typically, monitoring without treatment is recommended for patients who have both low hepatitis B surface antigen levels (less than 1,000 IU/mL) and low levels of HBV DNA (less than 2,000 IU/mL), and early antiviral therapy is recommended for patients who have high levels of HBV DNA (20,000 IU/mL or more). However, there is no clear evidence that early antiviral therapy benefits patients who have intermediate levels of HBV DNA (2,000-19,999 IU/mL) and are negative for hepatitis B e antigen. Biomarkers for risk-stratifying these patients also are lacking, the researchers noted.

Therefore, they studied a cohort of 2,666 adults who had tested positive for hepatitis B surface antigen and were followed at National Taiwan University Hospital from 1985 through 2000. No patient had cirrhosis at baseline. In all, 209 patients developed hepatocellular carcinoma, yielding an incidence rate of 4.91 cases per 1,000 person-years.
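
As a rough check on these figures (back-of-the-envelope arithmetic from the two reported numbers, not a value given by the authors), the incidence rate implies roughly 42,600 person-years of total follow-up:

```latex
% Incidence rate = events / person-time at risk, so
% person-time = events / rate.
\[
\text{person-years} = \frac{209\ \text{cases}}{4.91 / 1{,}000\ \text{cases per person-year}}
\approx 42{,}566
\]
```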

Hepatitis B core–related antigen level remained an independent risk factor for hepatocellular carcinoma after accounting for age, sex, serum alanine aminotransferase (ALT) level, FIB-4 index, hepatitis B e antigen status, hepatitis B genotype (B, C, or undetermined), and HBV DNA level. Compared with patients whose HBcrAg level was less than 10 KU/mL, a level of 10-99 KU/mL was associated with a nearly threefold increase in risk for hepatocellular carcinoma (hazard ratio [HR], 2.93; 95% CI, 1.67-4.80), and this risk rose even further as HBcrAg levels increased.

In the subgroup of patients who tested negative for hepatitis B e antigen, had an intermediate HBV DNA load (2,000-19,999 IU/mL), and had a normal baseline ALT level (less than 40 U/L), a high HBcrAg level (10 KU/mL or more) was tied to a nearly fivefold greater risk for hepatocellular carcinoma (HR, 4.89; 95% CI, 2.18-10.93). This approximated the risk observed with a high viral load (20,000 IU/mL or more), the researchers noted. In contrast, a low HBcrAg level was associated with a risk similar to that of minimal-risk carriers (annual incidence rate, 0.10%; 95% CI, 0.04%-0.24%).
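
Taken together, the stratification described above reduces to a simple decision rule for HBeAg-negative patients. The sketch below is illustrative only, with hypothetical function and parameter names rather than code from the study:

```python
# Hypothetical sketch of the risk stratification described in the text.
# Units: HBV DNA in IU/mL, HBcrAg in KU/mL, ALT in U/L.
def hcc_risk_group(hbv_dna: float, hbcrag: float, alt: float) -> str:
    """Classify an HBeAg-negative, noncirrhotic patient's HCC risk."""
    if hbv_dna >= 20_000:
        return "high risk (high viral load)"
    if hbv_dna < 2_000:
        return "low viral load: continue monitoring"
    # Intermediate viral load (2,000-19,999 IU/mL).
    if alt < 40 and hbcrag >= 10:
        return "high risk (approximates high viral load)"
    if alt < 40:
        return "minimal risk (similar to minimal-risk carriers)"
    return "intermediate viral load with elevated ALT: evaluate further"
```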

“To the best of our knowledge, this is the first study to report HBcrAg level as an independent viral biomarker to stratify hepatocellular risks in a large number of patients with intermediate viral load,” the researchers commented. Among the study limitations, 412 patients received antiviral therapy during follow-up. “This is a retrospective cohort study including Asian HBV patients with genotype B or C infection,” the investigators added. “It is unclear whether this finding could be extrapolated to populations with other HBV genotype infections. Nonetheless, we had a sound cohort, as several HBsAg-related clinical findings based on our cohort have already been validated by other prospective cohort studies, implying that our data were unlikely to be biased by the study design.”

Funders included National Taiwan University Hospital, the Ministry of Science and Technology, Executive Yuan in Taiwan, and National Health Research Institutes. The researchers reported having no conflicts of interest.

SOURCE: Tseng T-C et al. Gastroenterology. 2019 Aug 27. doi: 10.1053/j.gastro.2019.08.028.

Daily aspirin might cut risk of fibrosis progression

Taking daily aspirin may help keep nonalcoholic fatty liver disease (NAFLD) from progressing to liver fibrosis and nonalcoholic steatohepatitis (NASH), suggest the results of a prospective study of 361 adults.

Previously, preclinical evidence had linked aspirin to prevention of fibrogenesis in fatty liver disease, but this is the first prospective study to report such an association. Daily aspirin use “was associated with less severe histologic features of NAFLD (nonalcoholic fatty liver disease) at study enrollment and with significantly lower risk for advanced fibrosis over time in a duration-dependent manner,” wrote Tracey G. Simon, MD, MPH, and her associates. Their report is in Clinical Gastroenterology and Hepatology.

The study comprised 361 adults with biopsy-confirmed NAFLD who were enrolled in the Massachusetts General Hospital NAFLD Repository between 2006 and 2015. At baseline, 151 individuals were already on daily aspirin, usually for primary (54%) or secondary (30%) prevention of cardiovascular disease. Median duration of aspirin use was 2.5 years. After a median 7.4 years of follow-up (which was similar between aspirin users and nonusers), daily aspirin use was associated with significantly lower odds of NASH (adjusted odds ratio, 0.68; 95% confidence interval, 0.37-0.89) and fibrosis (aOR, 0.54; 95% CI, 0.31-0.82).

The researchers did not find a similar protective effect for nonsteroidal anti-inflammatory drugs (NSAIDs) other than aspirin (adjusted hazard ratio for advanced fibrosis, 0.93; 95% CI, 0.81-1.05). This might be because aspirin and nonaspirin NSAIDs affect COX isoforms differently: aspirin inhibits them irreversibly, whereas other NSAIDs do so reversibly, they added. “Nonaspirin NSAIDs also disrupt the intestinal barrier, increasing delivery of proinflammatory cytokines to the liver,” they wrote. “Finally, aspirin uniquely modulates bioactive lipids by stimulating the biosynthesis of pro-resolving mediators and inhibiting proinflammatory lipids, which in turn may prevent progressive liver damage.”

In this study, a single blinded hepatopathologist interpreted baseline liver biopsy specimens, and patients were followed every 3-6 months with clinical examinations and serial calculations of FIB-4, NFS, and APRI scores. All patients were followed for at least a year. Patients were classified as users of nonaspirin NSAIDs if they reported using an NSAID besides aspirin at least twice weekly, or if they had been prescribed drugs such as ibuprofen, naproxen, ketoprofen, diclofenac, or indomethacin.
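
For concreteness, that exposure definition amounts to a two-part rule. The following is a minimal sketch with hypothetical field names, not the study's actual classification code:

```python
# Hypothetical sketch of the nonaspirin-NSAID exposure definition above.
NONASPIRIN_NSAIDS = {"ibuprofen", "naproxen", "ketoprofen",
                     "diclofenac", "indomethacin"}

def is_nonaspirin_nsaid_user(reported_uses_per_week: int,
                             prescribed_drugs: set[str]) -> bool:
    """True if either criterion for nonaspirin NSAID use is met."""
    reports_regular_use = reported_uses_per_week >= 2
    has_prescription = bool(NONASPIRIN_NSAIDS
                            & {d.lower() for d in prescribed_drugs})
    return reports_regular_use or has_prescription
```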

In a longitudinal analysis of the 317 patients who had early-stage (F0-2) fibrosis at baseline, 86 developed new-onset advanced fibrosis over 3,692 person-years of follow-up, the researchers said. In all, 26 individuals developed hepatic decompensation and 18 patients died, including eight from liver-related causes. Importantly, the link between aspirin and decreased risk of fibrosis progression seemed to depend on duration of use (adjusted P trend = .026), with the greatest benefit seen with 4 years or more of use (aHR, 0.50; 95% CI, 0.35-0.73). Although subgroup analyses were limited by lack of power, daily aspirin use was associated with 36% lower odds of incident advanced fibrosis among the 72 study participants who had paired biopsy samples, even after accounting for the effects of age, sex, baseline fibrosis stage, and time between biopsies (aOR, 0.64; 95% CI, 0.50-0.80).

“Our findings add to the growing literature supporting the potential hepatoprotective effects of aspirin in NAFLD,” the researchers concluded. “Research to uncover the mechanisms by which aspirin might prevent fibrogenesis could help develop urgently needed antifibrotic therapies for NAFLD.”

Funders included the National Institutes of Health and the AASLD Foundation. The investigators reported having no conflicts of interest.

SOURCE: Simon TG et al. Clin Gastroenterol Hepatol. 2019 May 8.

Aspirin may be an option in NAFLD

Slowing, preventing, or reversing fibrogenesis in patients with NAFLD remains an unmet need. Lifestyle interventions benefit this population but are challenging because of concerns about adherence and sustainability, thus favoring pharmacologic interventions.

The study by Simon et al. provides initial prospective evidence of a role for aspirin in reducing progression of fibrosis. In a thoughtful design, the authors showed both cross-sectional and longitudinal associations between aspirin use and reduced risk for progression of fibrosis, all with biological coherence and while accounting for various confounding factors. Although blood-based noninvasive assessment of liver fibrosis (by FIB-4, NFS, and APRI) has at best moderate accuracy for determining progression of fibrosis in NAFLD, the relatively high FIB-4 cutoff value used by the authors and their sensitivity analyses (including liver biopsy and combined endpoints of blood-based markers) strengthen confidence in their results.

However, before we can start prescribing aspirin to halt progression of fibrosis in NAFLD, larger and adequately powered studies are needed. Caution with the use of aspirin as prophylaxis for atherosclerotic cardiovascular disease (ASCVD) is now advised, based on results from large clinical trials (e.g., ASCEND). NAFLD patients represent a particular population with both a high ASCVD risk and a high risk for gastrointestinal bleeding, and it is unclear what the number needed to treat or to harm would be without confirmatory studies.

An “NAFLD polypill” combining drugs that address multiple metabolic pathways (e.g., aspirin, a statin, and vitamin E) might well tip the scale in favor of improved clinical outcomes, a concept recently shown to be beneficial for ASCVD prevention in the PolyIran study. Until then, properly weighing the use of prophylactic aspirin in patients with NAFLD and adhering to standard recommendations is advised.

Andres Duarte-Rojo, MD, PhD, is associate professor of medicine, division of gastroenterology, hepatology, and nutrition at the University of Pittsburgh Medical Center, and Pittsburgh Liver Research Center. He received research support from Echosens, USA.

Threshold for positivity affects FIT sensitivity for detecting CRC, advanced adenomas

Thresholds for positivity affected the sensitivity and (to a lesser extent) the specificity of quantitative fecal immunochemical tests used in the detection of colorectal cancer, which suggests that centers should consider lowering their thresholds for positivity if they have sufficient resources to handle an increase in follow-up colonoscopies, researchers wrote in Gastroenterology.

“Additional data are needed regarding the influence of sex and age on test performance,” wrote Kevin Selby, MD, of Kaiser Permanente Division of Research in Oakland, Calif., together with his associates. Additional studies also should evaluate the effect of a quantitative threshold of 10 mcg of hemoglobin per gram of feces and multiple rounds of annual testing, they added.

Fecal immunochemical tests (FITs) are recommended for colorectal cancer screening because they are diagnostically superior and are associated with higher participation rates, compared with guaiac fecal occult blood tests, the investigators noted. For screening, the optimal positivity threshold for quantitative FIT remains controversial, is likely to vary by sex and age, and also may be adjusted to reflect local health care resources. To more closely evaluate how the positivity threshold affects FIT performance, the researchers searched MEDLINE, EMBASE, and the Database of Abstracts of Reviews of Effects for articles on the use of FIT for asymptomatic (screening) colorectal cancer detection in adults. The search identified 46 studies with 2.4 million participants and 6,478 detected cancers. The researchers then calculated sensitivity, specificity, and numbers of detected cancers, advanced adenomas, and positive test results at positivity thresholds of up to 10 mcg, 10-20 mcg, 20-30 mcg, and more than 30 mcg of hemoglobin per gram of feces. They also examined subgroups stratified by sex and age.
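
To make the threshold trade-off concrete, the sketch below (synthetic inputs, not the meta-analysis data) shows how sensitivity and specificity follow from a chosen positivity cutoff:

```python
# Hypothetical sketch: performance of a quantitative FIT at a given
# cutoff (mcg hemoglobin per gram of feces), given each participant's
# measured fecal hemoglobin and colonoscopy-confirmed cancer status.
def fit_performance(fecal_hb: list[float], has_crc: list[bool],
                    cutoff: float) -> tuple[float, float]:
    """Return (sensitivity, specificity) at the given cutoff."""
    tp = sum(hb >= cutoff and crc for hb, crc in zip(fecal_hb, has_crc))
    fn = sum(hb < cutoff and crc for hb, crc in zip(fecal_hb, has_crc))
    fp = sum(hb >= cutoff and not crc for hb, crc in zip(fecal_hb, has_crc))
    tn = sum(hb < cutoff and not crc for hb, crc in zip(fecal_hb, has_crc))
    return tp / (tp + fn), tn / (tn + fp)

# Lowering the cutoff can only reclassify results from negative to
# positive, so sensitivity rises while specificity falls.
```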

The pooled sensitivity for the detection of colorectal cancer rose from 69% (95% confidence interval, 63%-75%) at a positivity threshold of more than 10 and up to 20 mcg of hemoglobin per gram of feces, to 80% at a positivity threshold of 10 mcg or less of hemoglobin per gram of feces. “At these [same] threshold values, sensitivity for detection of advanced adenomas increased from 21% (95% CI, 18%-25%) to 31% (95% CI, 27%-35%), whereas specificity decreased from 94% (95% CI, 93%-96%) to 91% (95% CI, 89%-93%),” the researchers wrote.

Only three studies stratified results by sex, and these found no statistical difference in pooled sensitivity for detecting colorectal cancer among men (77%) versus women (81%). Age, too, was stratified in only three studies and did not significantly correlate with sensitivity. “More research is needed to precisely establish FIT thresholds for each sex and age subgroup,” the researchers said.

The National Cancer Institute and the Swiss Cancer Research Foundation provided funding. The investigators reported having no conflicts of interest.

SOURCE: Selby K et al. Gastroenterology. 2019 Aug 22. doi: 10.1053/j.gastro.2019.08.023.

Many screening programs could lower the positivity threshold

Quantitative fecal immunochemical tests (FITs) are the most recent incarnation of screening for colorectal cancer (CRC) through the identification of occult blood in stool. Older versions of such tests were the first screening modalities shown to decrease both the incidence and mortality of CRC. FITs are much more sensitive for both CRC and advanced adenomas than those early occult blood tests, and they are among the least costly and most easily employed CRC screening modalities. Given the quantitative nature of FITs, the question has remained what positivity threshold achieves the optimal balance of sensitivity and specificity.

The current study by Selby et al. examined data from 46 studies and 2.4 million participants from 12 countries. The authors found that by lowering the positivity threshold to less than 10 mcg/g from greater than 10 mcg/g but less than 20 mcg/g, the sensitivity for CRC increased from 69% to 80% and for advanced adenomas from 21% to 31%, with a trivial fall in specificity from 94% to 91%. They also found that neither sex nor age significantly altered these outcomes in the minority of studies that stratified by these demographics. These outcomes suggest that screening programs should lower the positivity threshold for FITs to less than 10 mcg/g from the current less than 20 mcg/g recommended by the U.S. Multi-Society Task Force on Colorectal Cancer Screening.

Future studies should examine more carefully demographic effects on FIT performance to determine if different positivity thresholds need to be employed in different demographic groups.

Reid M. Ness, MD, MPH, is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center and at the Veterans Affairs Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness has no financial relationships to disclose.

AGA Clinical Practice Update: Coagulation in cirrhosis

Cirrhosis can involve “precarious” changes in hemostatic pathways that tip the scales toward either bleeding or hypercoagulation, experts wrote in an American Gastroenterological Association Clinical Practice Update.

Based on current evidence, clinicians should not routinely correct thrombocytopenia and coagulopathy in patients with cirrhosis prior to low-risk procedures, such as therapeutic paracentesis, thoracentesis, and routine upper endoscopy for variceal ligation, Jacqueline G. O’Leary, MD, of Dallas VA Medical Center and her three coreviewers wrote in Gastroenterology.

To optimize clot formation prior to high-risk procedures, and in patients with active bleeding, a platelet count above 50,000 per mcL is still recommended. However, it may be more meaningful to couple that platelet target with a fibrinogen level above 120 mg/dL rather than rely on the international normalized ratio (INR), the experts wrote. Not only does INR vary significantly depending on which thromboplastin is used in the test, but “correcting” INR with a fresh frozen plasma infusion does not affect thrombin production and worsens portal hypertension. Using cryoprecipitate to replenish fibrinogen has less impact on portal hypertension. “Global tests of clot formation, such as rotational thromboelastometry (ROTEM), thromboelastography (TEG), sonorheometry, and thrombin generation may eventually have a role in the evaluation of clotting in patients with cirrhosis but currently lack validated target levels,” the experts wrote.
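As a concrete reading of those targets, here is a minimal sketch; the helper function is hypothetical and is not part of the AGA practice update, but the thresholds come from the text above.

```python
# Hypothetical helper reflecting the targets described above for high-risk
# procedures or active bleeding: platelets > 50,000/mcL paired with
# fibrinogen > 120 mg/dL, rather than an INR-based target.

def meets_clot_formation_targets(platelets_per_mcl: float,
                                 fibrinogen_mg_dl: float) -> bool:
    """True when both pre-procedure hemostasis targets are met."""
    return platelets_per_mcl > 50_000 and fibrinogen_mg_dl > 120

# Example: thrombocytopenia with preserved fibrinogen fails the check.
print(meets_clot_formation_targets(42_000, 180))  # False
```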

They advised clinicians to limit the use of blood products (such as fresh frozen plasma and pooled platelet transfusions) because of cost and the risk of exacerbated portal hypertension, infection, and immunologic complications. For severe anemia and uremia, red blood cell transfusion (250 mL) can be considered. Platelet-rich plasma from one donor is less immunologically risky than a pooled platelet transfusion. Thrombopoietin agonists are “a good alternative” to platelet transfusion but require about 10 days for response. Alternative prothrombotic therapies include oral thrombopoietin receptor agonists (avatrombopag and lusutrombopag) to boost platelet count before an invasive procedure and antifibrinolytic therapy (aminocaproic acid and tranexamic acid) for persistent bleeding from mucosal oozing or puncture wounds. Desmopressin should be considered only for patients with comorbid renal failure.

For anticoagulation, the practice update recommends considering systemic heparin infusion for cirrhotic patients with symptomatic deep venous thrombosis (DVT) or portal vein thrombosis (PVT). However, the anti–factor Xa assay will not reliably monitor response if patients have low liver-derived antithrombin III (heparin cofactor). “With newly diagnosed PVT, the decision to intervene with directed therapy rests on the extent of the thrombosis, presence or absence of attributable symptoms, and the risk of bleeding and falls,” the experts stated.

Six-month follow-up imaging is recommended to assess anticoagulation efficacy. More frequent imaging can be considered for patients with PVT at high risk on therapeutic anticoagulation. If clots do not fully resolve after 6 months of treatment, options include extending therapy with the same agent, switching to a different anticoagulant class, or placing a transjugular intrahepatic portosystemic shunt (TIPS). “The role for TIPS in PVT is evolving and may address complications like portal hypertensive bleeding, medically refractory clot, and the need for repeated banding after variceal bleeding,” the experts noted.

Prophylaxis of DVT is recommended for all hospitalized patients with cirrhosis. Vitamin K antagonists and direct-acting oral anticoagulants (DOACs; dabigatran, apixaban, rivaroxaban, and edoxaban) are alternatives to heparin for anticoagulation of cirrhotic patients with either PVT or DVT, the experts wrote. However, DOACs are not recommended for most Child-Pugh B patients or for any Child-Pugh C patients.

No funding sources or conflicts of interest were reported.

SOURCE: O’Leary JG et al. Gastroenterology. 2019. doi: 10.1053/j.gastro.2019.03.070.

Atypical food allergies common in IBS

Among patients with irritable bowel syndrome (IBS) who tested negative for classic food allergies, confocal laser endomicroscopy showed that 70% had an immediate disruption of the intestinal barrier in response to at least one food challenge, with accompanying changes in epithelial tight junction proteins and eosinophils.

Among 108 patients who completed the study, 61% showed this atypical allergic response to wheat, wrote Annette Fritscher-Ravens, MD, PhD, of University Hospital Schleswig-Holstein in Kiel, Germany, and her associates. Strikingly, almost 70% of patients with atypical food allergies to wheat, yeast, milk, soy, or egg white who eliminated these foods from their diets showed at least an 80% improvement in IBS symptoms after 3 months. These findings were published in Gastroenterology.

Confocal laser endomicroscopy (CLE) “permits real-time detection and quantification of changes in intestinal tissues and cells, including increases in intraepithelial lymphocytes and fluid extravasation through epithelial leaks,” the investigators wrote. This approach helps clinicians objectively detect and measure gastrointestinal pathology in response to specific foods, potentially freeing IBS patients from highly restrictive diets that ease symptoms but are hard to follow, and are not meant for long-term use.

For the study, the researchers enrolled patients meeting Rome III IBS criteria who tested negative for common food antigens on immunoglobulin E serology and skin tests. During endoscopy, each patient underwent sequential duodenal challenges with 20-mL suspensions of wheat, yeast, milk, soy, and egg white, followed by CLE with biopsy.

Among 108 patients who finished the study, 76 (70%) were CLE positive. They and their first-degree relatives were significantly more likely to have atopic disorders than were CLE-negative patients (P = .001). The most common allergen was wheat (61% of patients), followed by yeast (20%), milk (9%), soy (7%), and egg white (4%). Also, nine patients reacted to two of the tested food antigens.

Compared with CLE-negative patients or controls, CLE-positive patients also had significantly more intraepithelial lymphocytes (P = .001) and postchallenge expression of claudin-2 (P = .023), which contributes to tight junction permeability and is known to be upregulated in intestinal barrier dysfunction, IBS, and inflammatory bowel disease. Conversely, levels of the tight junction protein occludin were significantly lower in duodenal biopsies from CLE-positive patients versus controls (P = .022). “Levels of mRNAs encoding inflammatory cytokines were unchanged in duodenal tissues after CLE challenge, but eosinophil degranulation increased,” the researchers wrote.

In a double-blind, randomized, crossover phase, patients then followed either a diet that excluded the antigen to which they had tested positive or a sham (placebo) diet that excluded only some foods containing the antigen, with a 2-week washout period in between. The CLE-positive patients showed a 70% average improvement in Francis IBS severity score after 3 months of the intervention diet and a 76% improvement at 6 months. Strikingly, 68% of CLE-positive patients showed at least an 80% improvement in symptoms, while only 4% did not respond at all.
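The response definitions reduce to simple arithmetic. The sketch below uses invented symptom-score values (the study used the Francis IBS severity score, but these numbers are hypothetical) to show how percent improvement and the at-least-80% responder cutoff are computed.

```python
# Hypothetical example of the outcome arithmetic: percent improvement in a
# symptom severity score and the >=80% responder definition.

def percent_improvement(baseline: float, follow_up: float) -> float:
    """Improvement as a percentage of the baseline score."""
    return 100.0 * (baseline - follow_up) / baseline

baseline_score, month3_score = 250.0, 70.0  # invented Francis-score values
change = percent_improvement(baseline_score, month3_score)
print(f"{change:.0f}% improvement; >=80% responder: {change >= 80}")
# -> 72% improvement; >=80% responder: False
```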

“Since we do not observe a histological mast cell/basophil increase or activation, and [we] do not find increased mast cell mediators (tryptase) in the duodenal fluid after positive challenge, we assume a nonclassical or atypical food allergy as cause of the mucosal reaction observed by CLE,” the researchers wrote. Other immune cell parameters remained unchanged, but additional studies are needed to see if these changes are truly absent or occur later after challenge. The researchers are conducting murine studies of eosinophilic food allergy to shed more light on these nonclassical food allergies.

Funders included the Rashid Hussein Charity Trust, the German Research Foundation, and the Leibniz Foundation. The researchers reported having no conflicts of interest.

SOURCE: Fritscher-Ravens A et al. Gastroenterology. 2019 May 14. doi: 10.1053/j.gastro.2019.03.046.

Endoscopist personality linked to adenoma detection rate

Endoscopists who described themselves as “compulsive” and “thorough” had significantly higher rates of adenoma detection, according to results from a self-reported survey of 117 physician endoscopists.

Financial incentives, malpractice concerns, and perceptions of adenoma detection rate as a quality metric were not associated with endoscopists’ detection rates in the survey.

“Adenoma detection rates were higher among physicians who described themselves as more compulsive or thorough, and among those who reported feeling rushed or having difficulty accomplishing goals,” Ghideon Ezaz, MD, of Beth Israel Deaconess Medical Center in Boston and associates wrote in Clinical Gastroenterology and Hepatology.

These feelings were related to withdrawal times rather than daily procedure volume. “We hypothesize that performing a meticulous examination is mentally taxing and can cause a physician to feel rushed or perceive that it is difficult to keep pace or accomplish goals,” the researchers wrote.

Adenoma detection rates vary widely among physicians – up to threefold in some studies. Researchers have failed to attribute most of this discrepancy to seemingly obvious factors such as the type of specialty training an endoscopist completes. The traditional fee-for-service payment model has been proposed as a culprit because physicians are paid for performing as many colonoscopies as possible rather than for procedural quality. Other potential variables include personality traits and endoscopists’ knowledge of and views on the importance of adenoma detection rates.

To examine the roles of these factors in adenoma detection rates, Dr. Ezaz and coinvestigators used electronic health records data from four health systems in Boston, Pittsburgh, North Carolina, and Seattle. Detection rates were adjusted to control for differences among patient populations. Next, the researchers surveyed the physicians who performed the endoscopies about their financial motivations, knowledge and perceptions of colonoscopy quality, and personality traits.

Among 117 physicians surveyed, the median risk-adjusted adenoma detection rate was 29.3%, with an interquartile range of 24.1%-35.5%. “We found no significant association between adenoma detection rate and financial incentives, malpractice concerns, or physicians’ perceptions of adenoma detection rate as a quality metric,” the researchers wrote.
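For readers unfamiliar with the metric, adenoma detection rate (ADR) is the share of a physician's screening colonoscopies that find at least one adenoma. The sketch below, with hypothetical per-physician counts rather than study data, shows how a median ADR with interquartile range is computed.

```python
# Hypothetical per-physician counts illustrating the ADR summary statistics.
import numpy as np

colonoscopies = np.array([180, 220, 150, 300, 240])  # exams per physician
with_adenoma = np.array([60, 55, 48, 90, 65])        # exams finding >=1 adenoma

adr = 100 * with_adenoma / colonoscopies             # ADR, percent
q1, median, q3 = np.percentile(adr, [25, 50, 75])
print(f"median ADR {median:.1f}% (IQR, {q1:.1f}%-{q3:.1f}%)")
```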

In contrast, endoscopists who described themselves as either much or somewhat more compulsive than their peers had significantly higher median adjusted rates of adenoma detection than did endoscopists who described themselves as about the same or somewhat less compulsive than others. These adenoma detection rates, in respective order, were 33.1%, 32.9%, 26.4%, and 27.3% (P = .0019). Adenoma detection rates also were significantly higher among physicians who described themselves as more thorough than their peers, who said they felt rushed during endoscopy, and who reported having difficulty pacing themselves, accomplishing goals, or managing unforeseen situations.

A secondary analysis revealed the same links between personality traits and adenomas per colonoscopy. The findings support an expert’s prior assertion (Gastrointest Endosc. 2007 Jan;65[1]:145-50) that the best endoscopists are “slow, careful, and compulsive,” the researchers noted. They recommended nurturing “meticulousness and attention to detail” during training and evaluating trainees based on these characteristics.

The National Cancer Institute provided funding. The researchers reported having no conflicts of interest.
 

SOURCE: Ezaz G et al. Clin Gastroenterol Hepatol. 2018 Oct 13. doi: 10.1016/j.cgh.2018.10.019.

Key clinical point: Endoscopists’ self-reported personality traits correlated significantly with their rates of adenoma detection.

Major finding: Self-reported compulsiveness, thoroughness, feeling rushed during endoscopy, and having difficulty pacing oneself, meeting goals, or managing unforeseen situations all correlated with significantly higher rates of adenoma detection, while financial incentives, malpractice concerns, and physicians’ perception of the value of adenoma detection did not.

Study details: Surveys of 117 physician endoscopists and analyses of electronic health records from four geographically diverse health systems where they worked.

Disclosures: The National Cancer Institute provided funding. The researchers reported having no conflicts of interest.

Source: Ezaz G et al. Clin Gastroenterol Hepatol. 2018 Oct 13. doi: 10.1016/j.cgh.2018.10.019.

COPD exacerbations associated with poor sleep quality

Poor subjective sleep quality was associated with subsequent symptomatic exacerbations of chronic obstructive pulmonary disease in an 18-month prospective study of 480 patients.

“Poor sleep quality in COPD has previously been associated with reduced health-related quality of life and reduced physical activity during the day,” wrote Matthew Shorofsky, MD, of McGill University, Montreal, and associates. Their report is in CHEST. “However, to our knowledge, this is the first population-based longitudinal study evaluating exacerbation risk in relation to subjective sleep disturbances and assessing previously diagnosed and undiagnosed COPD.”

The study included participants enrolled in the Canadian Respiratory Research Network and the Canadian Cohort Obstructive Lung Disease (CanCOLD) study who had COPD, available baseline Pittsburgh Sleep Quality Index (PSQI) scores, and 18 months of follow-up data. The PSQI includes 19 questions on sleep quality, latency, duration, efficiency, disturbances, use of sleep medications, and daytime dysfunction. Total scores range from 0 to 21, and a score above 5 is considered poor sleep. Online patient surveys and quarterly phone interviews were used to track symptom-based exacerbations (at least 48 hours of increased dyspnea, sputum volume, or sputum purulence) and event-based exacerbations (a symptom-based exacerbation plus the use of antibiotics, corticosteroids, or health services).
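The scoring itself is straightforward: in the instrument, the 19 items are collapsed into seven component scores of 0-3 each, and their sum is the 0-21 global score. A minimal sketch follows; the component values are invented.

```python
# Minimal PSQI global-score sketch: seven component scores (each 0-3,
# derived from the 19 items) sum to a 0-21 global score; > 5 = poor sleep.

def psqi_global_score(components: list[int]) -> int:
    """Sum the seven PSQI component scores into the global score."""
    assert len(components) == 7 and all(0 <= c <= 3 for c in components)
    return sum(components)

# Components: quality, latency, duration, efficiency, disturbances,
# sleep-medication use, daytime dysfunction (values invented).
score = psqi_global_score([1, 2, 1, 0, 2, 0, 1])
print(score, "poor sleep" if score > 5 else "not poor sleep")  # 7 poor sleep
```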

At baseline, 203 patients met the PSQI threshold for poor sleep quality. During follow-up, 185 patients had at least one COPD exacerbation. Poor sleep at baseline was significantly more prevalent among patients with symptom-based COPD exacerbations (50.3%) than among patients without them (37.3%; P = .01). Poor baseline sleep quality remained a significant risk factor for symptom-based exacerbations of COPD even after the researchers accounted for the effect of age, gender, body mass index, smoking, depression, angina, baseline inhaled respiratory medications, forced expiratory volume in 1 second percent predicted, and modified Medical Research Council (mMRC) dyspnea scale (adjusted risk ratio, 1.09; 95% confidence interval, 1.01-1.18; P = .02).
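As an aside on the statistic, a risk ratio and its confidence interval can be computed from a 2x2 table. The sketch below uses cell counts back-calculated approximately from the percentages above (not source data) and yields the unadjusted ratio, which differs from the covariate-adjusted 1.09 the authors report.

```python
# Unadjusted risk ratio with 95% CI from a 2x2 table. Counts are
# reconstructed approximately from the percentages above, and no
# covariate adjustment is applied.
import math

a, b = 93, 110   # poor baseline sleep: exacerbation yes / no
c, d = 92, 185   # better baseline sleep: exacerbation yes / no

rr = (a / (a + b)) / (c / (c + d))
se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))  # SE of log(RR)
log_rr = math.log(rr)
low = math.exp(log_rr - 1.96 * se)
high = math.exp(log_rr + 1.96 * se)
print(f"unadjusted RR {rr:.2f} (95% CI, {low:.2f}-{high:.2f})")
```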

Patients with at least one symptomatic exacerbation of COPD were significantly more likely to meet the PSQI threshold for poor sleep quality and had significantly higher median PSQI scores, compared with patients without exacerbations (6.0 [interquartile range, 3.0-8.0] vs. 5.0 [2.0-7.0]; P = .01). Poor baseline sleep quality also was associated with event-based exacerbations and with a shorter time to first symptom-based exacerbation. Sleep disturbances, such as rising to void or experiencing respiratory symptoms or pain during sleep, correlated most strongly with symptom-based exacerbations.

Several factors could explain the link between poor sleep quality and COPD exacerbations, the investigators wrote. Patients with inadequately controlled COPD have more frequent and unstable respiratory symptoms, which could disrupt sleep either directly or indirectly (secondary to medication use or anxiety, for example). Conversely, sleep disruption can impede immune function and increase systemic inflammation, which might worsen COPD control and increase exacerbation risk. Poor sleep can impair memory and cognition, “potentially fostering medication nonadherence and symptom flare-up, especially in the older COPD population.” Although the link is poorly understood, patients with COPD often have comorbid obstructive sleep apnea (OSA), which is associated with COPD exacerbations, the researchers wrote. Treating OSA is associated with improved COPD morbidity and fewer exacerbations and hospitalizations.

The researchers acknowledged limitations to their study design. “Individuals with asthma or other obstructive lung diseases could not be definitively excluded; methacholine challenges were not performed. However, analyses excluding self-reported asthma were consistent with our main results. Second, because definitions of COPD exacerbation vary among studies, comparison may be limited, but CanCOLD used a standard definition, as recommended by GOLD.”

The CanCOLD study has received funding from the Canadian Respiratory Research Network, Astra Zeneca Canada, Boehringer Ingelheim Canada, GlaxoSmithKline Canada, Novartis, Merck Nycomed, Pfizer Canada, and Theratechnologies. Dr. Shorofsky had no disclosures. Several coinvestigators reported ties to GlaxoSmithKline, Novartis, Boehringer Ingelheim, Merck, Almirall, and Theratechnologies.

SOURCE: Shorofsky M et al. CHEST. 2019 May 28. doi: 10.1016/j.chest.2019.04.132.

Cognitive decline sped up after CHD

Cognitive decline accelerates in the long term after patients develop coronary heart disease (CHD), according to the results of a large prospective study with a median of 12 years of follow-up.

“We found that incident CHD was significantly associated with faster post–CHD-diagnosis cognitive decline, but not pre–CHD-diagnosis or short-term cognitive decline after the event,” Wuxiang Xie, PhD, of Peking University Health Science Center, Beijing, and associates wrote in the Journal of the American College of Cardiology. Linear mixed models showed that cognitive decline sped up during the year after incident CHD.

Past research had suggested a link between accelerated cognitive decline and CHD, but the temporal pattern of the relationship was unclear. For the study, Dr. Xie and associates followed 7,888 adults from the English Longitudinal Study of Aging who were an average of 62 years old and had no history of stroke, MI, angina, or dementia (Alzheimer’s disease or otherwise). All participants underwent a baseline cognitive assessment for verbal memory, semantic fluency, and temporal orientation, plus a median of six follow-up assessments.

In all, 480 (6%) participants developed CHD during follow-up. Their rate of cognitive decline remained constant before and immediately after their CHD diagnosis, but in subsequent years they experienced significant accelerations in loss of global cognitive function, verbal memory, and temporal orientation, even after accounting for time and many demographic and clinical variables. For example, the slope representing temporal change in global cognitive score decreased by a mean of 0.039 per year, compared with the pre-CHD slope (slope difference, –0.039; 95% confidence interval, –0.063 to –0.015; P = .002). Semantic fluency also declined faster after CHD, but the difference, compared with before CHD, did not reach statistical significance (P = .11).
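The modeling approach can be sketched in a few lines. The example below is illustrative only: it fits a linear mixed model with a per-participant random intercept and slope plus a "time since CHD diagnosis" term whose coefficient captures the post-diagnosis change in slope. The file and column names are hypothetical, and the authors' full covariate set is omitted.

```python
# Illustrative linear mixed model for a post-diagnosis change in cognitive
# slope. Data file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cognition_long.csv")
# Expected long-format columns:
#   id              participant identifier
#   cognition       global cognitive z score at each assessment
#   time            years since baseline
#   time_since_chd  0 before CHD diagnosis; years elapsed after it

model = smf.mixedlm(
    "cognition ~ time + time_since_chd",  # fixed effects: baseline slope + post-CHD change
    df,
    groups=df["id"],                      # random effects grouped by participant
    re_formula="~time",                   # random intercept and random slope
)
result = model.fit()
print(result.summary())  # negative time_since_chd coefficient = faster post-CHD decline
```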

Individuals without CHD showed no such accelerations in cognitive decline throughout follow-up in adjusted models, the researchers wrote. “Based on repeated cognitive measurements over a long follow-up period, this study revealed a reliable and robust trajectory of cognitive decline [after CHD]. Future studies are warranted to determine the precise mechanisms linking incident CHD to cognitive decline.”

Funders included the National Natural Science Foundation of China, the Beijing Natural Science Foundation, and the Newton International Fellowship from the Academy of Medical Sciences. The researchers reported having no relevant financial disclosures.

SOURCE: Xie W et al. J Am Coll Cardiol. 2019 Jun 17. doi: 10.1016/j.jacc.2019.04.019.

Targeting CHD might slow cognitive decline

The findings “highlight the role of cardiovascular risk factors and cardiovascular health as crucial determinants of cognitive trajectories in later life,” wrote Suvi P. Rovio, PhD; Katja Pahkala, PhD; and Olli T. Raitakari, MD, PhD. For example, accelerated declines in verbal memory might indicate a specific vulnerability to vascular changes within the medial temporal lobe and hippocampus.

The fact that cognitive decline did not accelerate immediately after coronary heart disease suggests that CHD itself does not acutely alter the brain, such as by causing microinfarcts, they commented. Instead, CHD might induce longer-term shifts in cerebral vascular function by affecting the blood-brain barrier or cerebral perfusion and oxygenation. While these complex relationships need further untangling, the study suggests that interventions that cut CHD risk might also help prevent cognitive decline, and slow its rate if it does occur.

Dr. Rovio, Dr. Pahkala, and Dr. Raitakari are at the University of Turku (Finland) and Turku University Hospital. These comments are adapted from an editorial accompanying the article by Xie et al. (J Am Coll Cardiol. 2019 Jun 17. doi: 10.1016/j.jacc.2019.04.020). They reported having no relevant financial disclosures.


Tofacitinib upped herpes zoster risk in ulcerative colitis


Among patients with moderate to severe ulcerative colitis, tofacitinib therapy for a median of 1.4 years (and up to 4.4 years) was safe apart from a dose-related increase in the risk of herpes zoster infection, according to an integrated analysis of data from five clinical trials.


Compared with placebo, a 5-mg twice-daily maintenance dose of tofacitinib (Xeljanz) was associated with a nonsignificant 2.1-fold increase in herpes zoster infection (95% confidence interval, 0.4-6.0), while a 10-mg twice-daily dose was associated with a statistically significant 6.6-fold increase in incidence (95% CI, 3.2-12.2).
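
As a rough illustration of how such an incidence ratio and its confidence interval can be derived from event counts, here is a short Python sketch. The counts and person-years below are hypothetical placeholders (the raw counts are not given in this summary), chosen only so that the ratio lands near the reported 6.6.

```python
# Illustrative only: hypothetical event counts and person-years, showing how an
# incidence rate ratio and a log-scale Wald 95% CI are computed for Poisson counts.
import math

def irr_ci(e1: int, py1: float, e0: int, py0: float, z: float = 1.96):
    """Incidence rate ratio (arm 1 vs. arm 0) with a 95% CI on the log scale."""
    irr = (e1 / py1) / (e0 / py0)
    se = math.sqrt(1 / e1 + 1 / e0)  # approximate SE of log(IRR)
    return irr, irr * math.exp(-z * se), irr * math.exp(z * se)

# Hypothetical: 30 zoster cases over 500 py on 10 mg vs. 4 over 440 py on placebo
print(irr_ci(30, 500.0, 4, 440.0))
```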

With the exception of the higher incidence rate of herpes zoster, “in the overall cohort, the safety profile of tofacitinib was generally similar to that of tumor necrosis factor inhibitor therapies,” wrote William J. Sandborn, MD, director of the inflammatory bowel disease center and professor of medicine at the University of California, San Diego, and associates. The findings were published in Clinical Gastroenterology and Hepatology.

Tofacitinib is an oral small-molecule Janus kinase inhibitor approved in the United States for treating moderate to severe ulcerative colitis, as well as rheumatoid and psoriatic arthritis. The recommended ulcerative colitis dose is 10 mg twice daily for at least 8 weeks (induction therapy) followed by 5 or 10 mg twice daily (maintenance). The safety of tofacitinib has been studied in patients with rheumatoid arthritis through 9 years of treatment. To begin a similar undertaking in ulcerative colitis, Dr. Sandborn and associates pooled data from three 8-week, double-blind, placebo-controlled induction trials, as well as one 52-week, double-blind, placebo-controlled maintenance trial and one ongoing open-label trial. All patients received twice-daily tofacitinib (5 mg or 10 mg) or placebo.

Among 1,157 tofacitinib recipients in the pooled analysis, 84% received an average of 10 mg twice daily. For every 100 person-years of tofacitinib exposure, there were an estimated 2.0 serious infections, 1.3 opportunistic infections, 4.1 herpes zoster infections, 1.4 malignancies (including nonmelanoma skin cancer, which had an incidence of 0.7), 0.2 major adverse cardiovascular events, and 0.2 gastrointestinal perforations. The likelihood of these events did not increase with time on tofacitinib, the researchers said.
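
For readers who want to reproduce the per-100-person-years arithmetic, the sketch below computes an incidence rate with an exact Poisson confidence interval. The 66-event count is a hypothetical illustration (consistent with roughly 4.1 events per 100 person-years over the program's 1,612.8 person-years of exposure), not a number reported by the authors.

```python
# Illustrative arithmetic only: the event count is hypothetical, chosen to
# reproduce a rate of the same order as those reported (per 100 person-years).
from scipy.stats import chi2

def rate_per_100py(events: int, person_years: float):
    """Incidence rate per 100 person-years with an exact Poisson 95% CI."""
    rate = 100 * events / person_years
    lo = chi2.ppf(0.025, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(0.975, 2 * (events + 1)) / 2
    return rate, 100 * lo / person_years, 100 * hi / person_years

# Hypothetical: 66 herpes zoster cases over 1,612.8 person-years -> ~4.1/100 py
print(rate_per_100py(66, 1612.8))
```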

Worsening ulcerative colitis was the most common serious adverse event among patients who received both induction and maintenance therapy. For patients on maintenance therapy, only herpes zoster infection had a higher incidence than placebo, which reached statistical significance at the 10-mg dose. These safety findings resemble those in rheumatoid arthritis trials of tofacitinib, and apart from herpes zoster, they also resemble safety data for vedolizumab (an integrin receptor antagonist) and anti-tumor necrosis factor agents in ulcerative colitis, the researchers wrote.

There were four deaths during the entire tofacitinib ulcerative colitis program, for an incidence rate of 0.2 per 100 person-years of exposure. All occurred in patients receiving 10 mg twice daily. Causes of death were dissecting aortic aneurysm, hepatic angiosarcoma, acute myeloid leukemia, and pulmonary embolism in a patient with cholangiocarcinoma that had metastasized to the peritoneum. Recently, concerns about pulmonary embolism have led the European Medicines Agency (EMA) to recommend against the use of the 10-mg twice-daily tofacitinib dose in patients at increased risk for pulmonary embolism.

“Compared with prior experience with tofacitinib in rheumatoid arthritis, no new or unexpected safety signals were identified,” the researchers concluded. “These safety findings support the long-term use of tofacitinib 5 and 10 mg twice daily in patients with moderately to severely active” ulcerative colitis.

Pfizer makes tofacitinib, funded the individual trials, and paid for medical writing. Dr. Sandborn disclosed grants, personal fees, and nonfinancial support from Pfizer and many other pharmaceutical companies.

SOURCE: Sandborn WJ et al. Clin Gastroenterol Hepatol. 2018 Nov 23. doi: 10.1016/j.cgh.2018.11.035.

How safe is tofacitinib?

As new mechanisms of action become available for ulcerative colitis (UC) drugs, clinicians must weigh the risks versus benefits (i.e., safety vs. efficacy). In this article, Sandborn and colleagues provide additional information on the safety profile of tofacitinib. They report an increased risk of herpes zoster that was dose dependent (a sixfold increase on 10 mg twice daily). The overall safety profile was reassuring, is similar to that of the rheumatoid arthritis population treated with tofacitinib, and is in line with the safety profile of anti-TNF antibodies (excluding the increased risk of zoster). With a nonlive zoster vaccine now available, some have advocated vaccinating all patients being started on tofacitinib. However, there is a theoretical risk of disease exacerbation, and ongoing studies will hopefully answer this important question.

Another emerging safety concern with tofacitinib involves venous thromboembolism (VTE). The Food and Drug Administration recently issued a warning based on the findings of a safety trial in rheumatoid arthritis, which found an increased risk of PE and death in patients on the 10-mg twice-daily dose. The exact details of the risk have yet to be released. Enrollment in the trial required patients to be older than 50 years and to have at least one cardiovascular risk factor. The European regulatory body, the EMA, recently forbade the use of the 10-mg dose of tofacitinib for anyone at increased risk for VTE. It is unclear whether this risk applies to patients younger than 50 years without cardiovascular risk factors, or to the UC population. In the current study of UC patients, major cardiovascular events were rare (n = 4; incidence rate, 0.2 per 100 person-years). In the short term, it may be prudent to restrict the 10-mg twice-daily dose to those who do not fall into the high-risk category, or to try to reduce the dose to 5 mg twice daily when possible.

David A. Schwartz, MD, professor of medicine, division of gastroenterology, hepatology and nutrition, Inflammatory Bowel Disease Center, Vanderbilt University, Nashville.

Vitals

Key clinical point: Tofacitinib therapy shows a dose-related increase in risk of herpes zoster in patients with ulcerative colitis.

Major finding: Compared with placebo, a 5-mg twice-daily maintenance dose of tofacitinib produced a 2.1-fold greater risk of herpes zoster infection (95% CI, 0.4-6.0), while a 10-mg twice-daily dose produced a statistically significant 6.6-fold increase in incidence (95% CI, 3.2-12.2).

Study details: Integrated safety analysis of five clinical trials (four randomized, double-blind, and placebo-controlled) with 1,612.8 total person-years of exposure (median treatment duration, 1.4 years).

Disclosures: Pfizer makes tofacitinib, funded the individual trials, and paid for medical writing. Dr. Sandborn disclosed grants, personal fees, and nonfinancial support from Pfizer and many other pharmaceutical companies.

Source: Sandborn WJ et al. Clin Gastroenterol Hepatol. 2018 Nov 23. doi: 10.1016/j.cgh.2018.11.035.
