Estimated GFR, Albuminuria Predict Mortality Across All Age Groups

Elderly With Kidney Impairment at High Risk of Death

Kidney measures such as low estimated glomerular filtration rate and high albuminuria are strongly associated with mortality and end-stage renal disease across all age groups – even in the elderly, according to a collaborative meta-analysis reported online Oct. 30 in JAMA and presented simultaneously at Kidney Week.

The risk for chronic kidney disease, which in turn is closely allied with the risk for cardiovascular disease and all-cause mortality, typically is gauged by assessing estimated GFR (eGFR) and albuminuria levels. But there has been substantial controversy regarding the accuracy of these measures for predicting mortality and CKD risk in the elderly, because kidney function appears to decline markedly even in apparently healthy people as they age, said Dr. Stein I. Hallan and his associates in the Chronic Kidney Disease Prognosis Consortium.

Some experts even hold that reduced GFR might simply be part of the natural aging process and that the kidneys undergo an inevitable senescence, rendering "normal" markers of kidney function unusable in the elderly, said Dr. Hallan, of St. Olav University Hospital and the Norwegian University of Science and Technology, both in Trondheim, and his colleagues.

To examine whether aging modifies the usefulness of estimated GFR and albuminuria in assessing the risks for mortality and CKD, Dr. Hallan and his associates analyzed data from 46 cohorts worldwide that included the entire adult age range (18-108 years). The Chronic Kidney Disease Prognosis Consortium includes data on 20 North American, 12 European, 10 Asian, 1 Australian, and 3 multinational cohorts comprising more than 2 million study subjects followed for a mean of 6 years.

Among the study cohorts, 26 involved people from the general population, 8 involved patients at high risk for vascular disease, and the remaining 12 involved patients with CKD.

During follow-up there were 112,325 deaths in the general population and the high-risk cohorts, as well as 9,037 deaths in the CKD cohorts. There were 2,766 end-stage renal disease (ESRD) events in the general population and high-risk cohorts, as well as 5,962 ESRD events in the CKD cohorts.

Both mortality risk and the risk of ESRD events strongly increased with decreasing GFR across all age groups, even though the study populations had widely divergent demographic and clinical characteristics, the investigators said. The relative risks, however, declined with increasing age (JAMA 2012 Oct. 30 [doi: 10.1001/jama.2012.16817]).

This correlation remained robust when the data were adjusted to account for patient sex, race, history of cardiovascular disease, blood pressure, serum cholesterol levels, body mass index, smoking status, and diabetes status. For example, the adjusted hazard ratio for all-cause mortality in subjects with an eGFR of 45 (compared with 80) mL/min per 1.73 m2 was 3.50 in those aged 18-54 years, and declined with age to 1.35 in those aged at least 75 years.

The findings were similar for albuminuria levels, with high levels predicting mortality and ESRD events across all age groups.

"Although some variation in management of CKD should be considered by age, based on cost and benefits, with respect to risk of mortality and ESRD, our data support a common definition and staging of CKD based on eGFR and albuminuria for all age groups," they said.

These results contradict the concern "that CKD guidelines should be used with caution in older individuals and that low eGFR reflects only natural aging." They also support recommendations that CKD measures be added to mortality risk equations.

In addition, "the strong increase in mortality along with kidney measures at older ages suggests that older adults should not be left out from management strategies of CKD. Previous data show that low eGFR in the very old is associated with classical CKD complications like anemia, acidosis, hyperparathyroidism, and hyperphosphatemia," the researchers said.

This study was supported by a variety of government agencies, medical research councils, and industry sponsors. The 16 authors reported numerous ties to industry sources.

The medical community should conclude from these important new data that older adults with impaired kidney function are at high risk of death.

Since their excess mortality usually takes the form of cardiovascular disease, all appropriate preventive efforts should be taken in this patient population, including lifestyle modifications, blood pressure–lowering medications, renin-angiotensin system inhibitors if proteinuria is present, and lipid-lowering medications.

Furthermore, more study should be undertaken to assess the effects of commonly used glucose-lowering therapies in elderly patients, who have generally been excluded from clinical trials.

Dr. Ian H. de Boer is at the Kidney Research Institute at the University of Washington, Seattle. He reported receiving research funding from Abbott Laboratories. These remarks were taken from his editorial accompanying Dr. Hallan’s report (JAMA 2012 Oct. 30 [doi: 10.1001/jama.2012.30761]).

Article Source

FROM JAMA

Vitals

Major Finding: The risks of both mortality and ESRD strongly correlated with low estimated GFR and high albuminuria across all age groups, including the elderly.

Data Source: This was a meta-analysis of 46 large cohorts involving over 2 million study subjects followed for a mean of 6 years, to assess a possible effect of patient age on risk prediction of mortality and ESRD.

Disclosures: This study was supported by a variety of government agencies, medical research councils, and industry sponsors. The 16 authors reported numerous ties to industry sources.

Early Surgery Yields Survival Benefit for Low-Grade Gliomas

A Few Limitations

Adults in Norway with diffuse low-grade gliomas who were treated at a hospital advocating early surgical resection had better overall survival than those treated at a hospital advocating "watchful waiting," according to a report published online Oct. 30 in JAMA.

This finding significantly strengthens the sparse evidence in support of early resection for newly diagnosed diffuse low-grade gliomas, said Dr. Asgeir S. Jakola of the department of neurosurgery, St. Olav’s University Hospital, Trondheim (Norway), and his associates.

Management of these tumors is one of the major controversies in both neurology and oncology today, largely because the effect of surgery on survival is still unclear. The only evidence available until now was based solely on uncontrolled surgical series; some of these have reported that it is safe to withhold surgery until the lesions progress, while others have reported that immediate resection improves survival and delays the time to malignant transformation.

Both patients and physicians are reluctant to undertake immediate surgery when the evidence supporting that strategy has been so tenuous. They also are concerned that the risk of early and aggressive surgery outweighs the benefit, particularly when most patients are capable of normal activity and have a reasonably long life expectancy at diagnosis, the investigators said.

It is unlikely that a randomized, controlled study comparing the two approaches will ever be performed. Dr. Jakola and his colleagues therefore conducted a retrospective, population-based parallel-cohort study at two neurosurgical centers, each of which preferred one of these strategies over the other. Their "natural experiment" was possible because in Norway, there were two such facilities that were relatively close geographically and served a homogeneous population. The nationalized health care system distributes training, resources, and personnel equally throughout the country, so the two hospitals were quite similar in other respects. And patient follow-up was 100%.

The 12-year study involved 153 adults with diffuse, histologically verified supratentorial grade I and II tumors diagnosed in 1998-2009, who were followed until death or until April 2011. The median follow-up was 7 years. Gliomas included astrocytomas, oligodendrogliomas, and oligoastrocytomas.

For patients with newly diagnosed low-grade gliomas, hospital A favored biopsy and watchful waiting. The 66 patients treated there typically were followed with MRI at 3 and 6 months, then yearly thereafter. They usually were offered surgical resection if the lesions grew or showed signs of malignant transformation.

Hospital B favored immediate maximal safe tumor resection for the 87 patients treated there, with MRI follow-up at 6 and 12 months, then annually thereafter. This strategy was not pursued in some patients, however: notably, those who were elderly or had comorbidities and were likely to die from another cause before malignant transformation would take place, and those who had very widespread tumor infiltration that made resection impractical.

The two study groups were well balanced with regard to patient age and comorbidities, and rates of surgical rescue therapy were the same. There also were no differences between the two groups in complications or acquired neurologic deficits.

At the end of the study period, 34 patients (52%) from hospital A had died, compared with only 28 patients (32%) from hospital B. Median survival was 5.9 years at hospital A, but median survival had not yet been reached at hospital B, the researchers said (JAMA 2012;308 [doi:10.1001/jama.2012.12807]).

This survival advantage increased over time. Expected 3-year survival was 70% at hospital A vs. 80% at hospital B; expected 5-year survival was 60% at hospital A vs. 74% at hospital B; and expected 7-year survival was 44% at hospital A vs. 68% at hospital B.

In a post hoc analysis that attempted to account for differences between the two study groups in prognostic factors, the survival benefit for immediate resection remained robust. It also remained robust in another post hoc analysis that examined the subgroup of patients who had the most common glioma, a grade II astrocytoma. Median survival was 5.6 years at the hospital favoring watchful waiting, compared with 9.7 years at the hospital favoring early resection, in this large subgroup of patients.

Based on these findings, hospital A has changed its preferred strategy from watchful waiting to early resection, Dr. Jakola and his associates said.

"Despite the clear survival advantage seen, clinical judgment is still necessary in individual patients with suspected low-grade glioma since results will depend on patient and disease characteristics together with surgical results in terms of resection grades and complication rates," they added.

One of Dr. Jakola’s associates reported holding stock in Sonowand, manufacturer of the 3-D ultrasound-based imaging system used in one of the study hospitals.

This "natural experiment" may be the best source of evidence supporting early surgical resection that we’re likely to get, but the study by Dr. Jakola and his colleagues did have some limitations, said Dr. James M. Markert.

The confidence intervals around the point estimates for survival in both groups overlapped, which means the patients must be followed for a longer period to ensure that the confidence intervals eventually separate definitively. Also, one potentially important difference between the two study groups was not accounted for: the proportion of oligodendrogliomas, which are highly survivable, was higher at hospital B (19%) than at hospital A (9%).

In addition, radiation therapy was administered more often at the hospital favoring resection (43% of patients) than at the hospital favoring watchful waiting (29%), which may have affected survival rates. And although the authors reported no differences between the two groups in complications or neurologic deficits, "assessment methods were not delineated and the data were insufficient to reach a definitive conclusion," he noted.

Dr. Markert is in the division of neurosurgery at the University of Alabama at Birmingham. He reported ties to Catherex and Tocgen. These remarks were taken from his editorial accompanying Dr. Jakola’s report (JAMA 2012 Oct. 25 [doi:10.1001/jama.2012.14523]).

Author and Disclosure Information

Publications
Topics
Legacy Keywords
brain cancer, glioma, Dr. Asgeir S. Jakola, brain tumor
Author and Disclosure Information

Author and Disclosure Information

Body

This "natural experiment" may be the best source of evidence supporting early surgical resection that we’re likely to get, but the study by Dr. Jakola and his colleagues did have some limitations, said Dr. James M. Markert.

The confidence intervals around the point estimates for survival in both groups overlapped, which means the patients must be followed for a longer period to ensure that the confidence intervals eventually separate definitively. Also, one potentially important difference between the two study groups was not accounted for: the proportion of oligodendrogliomas, which are highly survivable, was higher at hospital B (19%) than at hospital A (9%).

In addition, radiation therapy was administered more often at the hospital favoring resection (43% of patients) than at the hospital favoring watchful waiting (29%), which may have affected survival rates. And although the authors reported no differences between the two groups in complications or neurologic deficits, "assessment methods were not delineated and the data were insufficient to reach a definitive conclusion," he noted.

Dr. Markert is in the division of neurosurgery at the University of Alabama at Birmingham. He reported ties to Catherex and Tocgen. These remarks were taken from his editorial accompanying Dr. Jakola’s report (JAMA 2012 Oct. 25 [doi:10.1001/jama.2012.14523]).

Body

This "natural experiment" may be the best source of evidence supporting early surgical resection that we’re likely to get, but the study by Dr. Jakola and his colleagues did have some limitations, said Dr. James M. Markert.

The confidence intervals around the point estimates for survival in both groups overlapped, which means the patients must be followed for a longer period to ensure that the confidence intervals eventually separate definitively. Also, one potentially important difference between the two study groups was not accounted for: the proportion of oligodendrogliomas, which are highly survivable, was higher at hospital B (19%) than at hospital A (9%).

In addition, radiation therapy was administered more often at the hospital favoring resection (43% of patients) than at the hospital favoring watchful waiting (29%), which may have affected survival rates. And although the authors reported no differences between the two groups in complications or neurologic deficits, "assessment methods were not delineated and the data were insufficient to reach a definitive conclusion," he noted.

Dr. Markert is in the division of neurosurgery at the University of Alabama at Birmingham. He reported ties to Catherex and Tocgen. These remarks were taken from his editorial accompanying Dr. Jakola’s report (JAMA 2012 Oct. 25 [doi:10.1001/jama.2012.14523]).

Title
A Few Limitations
A Few Limitations

Adults in Norway with diffuse low-grade gliomas who were treated at a hospital advocating early surgical resection had better overall survival than those treated at a hospital advocating "watchful waiting," according to a report published online Oct. 30 in JAMA.

This finding significantly strengthens the sparse evidence in support of early resection for newly diagnosed diffuse low-grade gliomas, said Dr. Asgeir S. Jakola of the department of neurosurgery, St. Olav’s University Hospital, Trondheim (Norway) and his associates.

Management of these tumors is one of the major controversies in both neurology and oncology today, largely because the effect of surgery on survival is still unclear. The only evidence available until now was based solely on uncontrolled surgical series; some of these have reported that it is safe to withhold surgery until the lesions progress, while others have reported that immediate resection improves survival and delays the time to malignant transformation.

Both patients and physicians are reluctant to undertake immediate surgery when the evidence supporting that strategy has been so tenuous. They also are concerned that the risk of early and aggressive surgery outweighs the benefit, particularly when most patients are capable of normal activity and have a reasonably long life expectancy at diagnosis, the investigators said.

It is unlikely that a randomized, controlled study comparing the two approaches will ever be performed. Dr. Jakola and his colleagues therefore conducted a retrospective, population-based parallel-cohort study at two neurosurgical centers, each of which preferred one of these strategies over the other. Their "natural experiment" was possible because in Norway, there were two such facilities that were relatively close geographically and served a homogenous population. The nationalized health care system distributes training, resources, and personnel equally throughout the country, so the two hospitals were quite similar in other respects. And patient follow-up is 100%.

The 12-year study involved 153 adults with diffuse, histologically verified supratentorial grade I and II tumors diagnosed in 1998-2009, who were followed until death or until April 2011. The median follow-up was 7 years. Gliomas included astrocytomas, oligodendrogliomas, and oligoastrocytomas.

For patients with newly diagnosed low-grade gliomas, hospital A favored biopsy and watchful waiting. The 66 patients treated there typically were followed with MRI at 3 and 6 months, then yearly thereafter. They usually were offered surgical resection, if the lesions grew or showed signs of malignant transformation.

Adults in Norway with diffuse low-grade gliomas who were treated at a hospital advocating early surgical resection had better overall survival than those treated at a hospital advocating "watchful waiting," according to a report published online Oct. 30 in JAMA.

This finding significantly strengthens the sparse evidence in support of early resection for newly diagnosed diffuse low-grade gliomas, said Dr. Asgeir S. Jakola of the department of neurosurgery, St. Olav’s University Hospital, Trondheim (Norway), and his associates.

Management of these tumors is one of the major controversies in both neurology and oncology today, largely because the effect of surgery on survival is still unclear. The only evidence available until now was based solely on uncontrolled surgical series; some of these have reported that it is safe to withhold surgery until the lesions progress, while others have reported that immediate resection improves survival and delays the time to malignant transformation.

Both patients and physicians are reluctant to undertake immediate surgery when the evidence supporting that strategy has been so tenuous. They also are concerned that the risk of early and aggressive surgery outweighs the benefit, particularly when most patients are capable of normal activity and have a reasonably long life expectancy at diagnosis, the investigators said.

It is unlikely that a randomized, controlled study comparing the two approaches will ever be performed. Dr. Jakola and his colleagues therefore conducted a retrospective, population-based parallel-cohort study at two neurosurgical centers, each of which preferred one of these strategies over the other. Their "natural experiment" was possible because in Norway, there were two such facilities that were relatively close geographically and served a homogeneous population. The nationalized health care system distributes training, resources, and personnel equally throughout the country, so the two hospitals were quite similar in other respects. And patient follow-up is 100%.

The 12-year study involved 153 adults with diffuse, histologically verified supratentorial grade I and II tumors diagnosed in 1998-2009, who were followed until death or until April 2011. The median follow-up was 7 years. Gliomas included astrocytomas, oligodendrogliomas, and oligoastrocytomas.

For patients with newly diagnosed low-grade gliomas, hospital A favored biopsy and watchful waiting. The 66 patients treated there typically were followed with MRI at 3 and 6 months, then yearly thereafter. They usually were offered surgical resection if the lesions grew or showed signs of malignant transformation.

Hospital B favored immediate maximal safe tumor resection for the 87 patients treated there, with MRI follow-up at 6 and 12 months, then annually thereafter. This strategy was not pursued in some patients, however: notably, those who were elderly or had comorbidities and were likely to die from another cause before malignant transformation would take place, and those who had very widespread tumor infiltration that made resection impractical.

The two study groups were well balanced with regard to patient age and comorbidities, and rates of surgical rescue therapy were the same. There also were no differences between the two groups in complications or acquired neurologic deficits.

At the end of the study period, 34 patients (52%) from hospital A had died, compared with only 28 patients (32%) from hospital B. Median survival was 5.9 years at hospital A, but median survival had not yet been reached at hospital B, the researchers said (JAMA 2012;308 [doi:10.1001/jama.2012.12807]).

This survival advantage increased over time. Expected 3-year survival was 70% at hospital A vs. 80% at hospital B; expected 5-year survival was 60% at hospital A vs. 74% at hospital B; and expected 7-year survival was 44% at hospital A vs. 68% at hospital B.

In a post hoc analysis that attempted to account for differences between the two study groups in prognostic factors, the survival benefit for immediate resection remained robust. It also held in a second post hoc analysis restricted to the subgroup of patients with the most common glioma, grade II astrocytoma: in this large subgroup, median survival was 5.6 years at the hospital favoring watchful waiting, compared with 9.7 years at the hospital favoring early resection.

Based on these findings, hospital A has changed its preferred strategy from watchful waiting to early resection, Dr. Jakola and his associates said.

"Despite the clear survival advantage seen, clinical judgment is still necessary in individual patients with suspected low-grade glioma since results will depend on patient and disease characteristics together with surgical results in terms of resection grades and complication rates," they added.

One of Dr. Jakola’s associates reported holding stock in Sonowand, manufacturer of the 3-D ultrasound-based imaging system used in one of the study hospitals.

Display Headline
Early Surgery Yields Survival Benefit for Low-Grade Gliomas
Legacy Keywords
brain cancer, glioma, Dr. Asgeir S. Jakola, brain tumor
Article Source

FROM JAMA

Vitals

Major Finding: Overall mortality was 52% with watchful waiting and 32% with early resection; median survival was 5.9 years in the first group but has not yet been reached in the second group.

Data Source: Investigators compared survival rates in one hospital that advocated watchful waiting (66 patients) and another that advocated early resection (87 patients) for low-grade gliomas.

Disclosures: One of Dr. Jakola’s associates reported holding stock in Sonowand, manufacturer of the 3-D ultrasound-based imaging system used in one of the study hospitals.

Psychosis Symptoms Tied to Higher Suicide Risk in Adolescents

Article Type
Changed
Display Headline
Psychosis Symptoms Tied to Higher Suicide Risk in Adolescents

Symptoms of psychosis were associated with greatly increased risk for suicidal behavior in the general adolescent population as well as in adolescents who have nonpsychotic disorders, such as depression, attention-deficit/hyperactivity disorder, anxiety disorders, or obsessive compulsive disorder, according to two separate epidemiologic studies reported online Oct. 29 in the Archives of General Psychiatry.

Among adolescents in the general population, as well as the subgroup of adolescents who had nonpsychotic DSM-III diagnoses, those who reported suicidal ideation, suicide planning, or suicidal acts were 10 times more likely than those without such behaviors to affirm on direct questioning that they had experienced psychotic symptoms – mainly auditory hallucinations, said Ian Kelleher, Ph.D., of the department of psychiatry, Royal College of Surgeons in Ireland, Dublin, and his associates.

"The immediate clinical relevance of these findings is that all patients presenting at risk for suicidal behavior should receive a thorough assessment of psychotic symptoms and not just a screening to rule out psychotic disorder," they noted.

Both primary care physicians and psychiatric clinicians must recognize that psychotic symptoms in a nonpsychotic patient signify a high suicide risk. "Research has shown that the largest increase in suicide risk in the general population occurs after there has already been contact with mental health services and that approximately half of patients who complete suicide [had] contact with primary care providers in the month preceding their death[s]," the investigators added.

Hallucinations and delusions, the classic symptoms of psychosis, are far more prevalent in the general population than are diagnosable psychotic disorders. They are "especially common in young people, with a meta-analysis of general population studies demonstrating a median prevalence of 17% in children aged 9-12 years and 7.5% in adolescents aged 13-17 years," the researchers wrote.

And psychosis is known to raise the risk of suicide dramatically. Yet no studies to date have examined the relationship between psychotic symptoms and suicidal behaviors among adolescents. Dr. Kelleher and his colleagues did so using data from two independent cross-sectional epidemiologic studies of the general Irish population.

The Adolescent Brain Development (ABD) study assessed the prevalence of psychotic symptoms among 1,131 students aged 11-13 years in 16 mainstream schools, representing more than half of the total school population in that age group. The Challenging Times (CT) study assessed the prevalence of psychiatric disorders among 743 students aged 13-15 years in eight mainstream schools.

For this study, Dr. Kelleher and his associates analyzed the results of in-depth diagnostic interviews for 212 subjects from the ABD study and 211 from the CT study, as well as the interview responses of their parents.

Overall, 22% of the ABD sample and 7% of the CT sample reported experiencing psychotic symptoms when they were specifically asked about them, almost all during the preceding year. "From our clinical experience, young people will rarely volunteer information on psychotic symptoms unless questioned directly about such experiences," the researchers noted. "Adolescents are usually willing to talk openly about their experiences, however, in response to direct but sensitive questioning."

Examples of such questions included: "Sometimes people when they are alone hear things or see things, and they’re not quite sure where they came from. Does that ever happen to you?" and "Was there ever a time when you thought that your imagination was playing tricks on you?"

In the two cohorts combined, 44 subjects reported suicidal ideation, 16 reported making specific suicidal plans, and 8 reported suicidal acts.

Adolescents in both studies who reported suicidal behavior were more than 10 times more likely than were those who did not report suicidal behavior to say that they had also experienced hallucinations or delusions, the investigators said (Arch. Gen. Psychiatry Oct. 29 [doi:10.1001/archgenpsychiatry.2012.164]).

"Strikingly, a majority of adolescents with suicidal plans or acts reported psychotic symptoms in both the ABD study (60%) and the CT (55%), studies," they noted.

The researchers also examined the link between psychotic symptoms and suicidal behavior in the subgroup of adolescents who had a diagnosable psychiatric disorder. These included major depressive disorder, adjustment disorder with depressed mood, ADHD, oppositional defiant disorder, conduct disorder, generalized anxiety disorder, social phobia, separation anxiety disorder, and OCD.

In these high-risk subjects, those who reported experiencing hallucinations or delusions were more than five times more likely to also report suicidal behavior than were subjects who had no psychotic symptoms.

Moreover, a further analysis of the data showed that adolescents who had psychotic symptoms were more likely to show the most serious suicidal behavior (planning and attempts) than the less serious suicidal ideation.

This study was not designed to examine the reasons for this robust association between psychotic symptoms and suicidal behavior, but there are several possible mechanisms.

"The most obvious is that hallucinations may direct the individual to harm or kill themselves." However, only one subject reported hearing a voice commanding him to do so, suggesting that the content of hallucinations is not to blame, at least not in this age group.

Indirect cognitive mechanisms may play a role. "Changes in the subjective sense of self, for example, are among the earliest recognizable symptoms of psychosis, and a sense of disintegration and fragmentation of the self resulting from intrusive voices or thoughts have been linked to suicidal thinking," Dr. Kelleher and his colleagues said.

Alternatively, "Bleuler’s concept of ‘the suicidal drive’ might not be just the most severe symptom of schizophrenia but the most severe symptom of a much broader psychosis phenotype made up of individuals in the general population who experience psychotic symptoms," they wrote.

It is also possible some common factor underlies both psychotic symptoms and suicidal behavior. For example, the symptoms may be a marker for deteriorating mental health, which in turn puts patients at high risk for suicide.

Traumatic experiences also might be an underlying factor. Adolescents who have experienced severe adverse events such as childhood physical or sexual abuse are known to be at increased risk for developing psychotic symptoms, and their psychological distress may also place them at high risk for suicidal behavior, the researchers said.

This study also could not delineate when psychotic symptoms arose in relation to suicidal behavior, because it was cross-sectional rather than longitudinal. "Further research with more temporal information will help to address this point," they added.

This study was funded by the European Community’s Seventh Framework Programme. No conflicts of interest were reported.

Legacy Keywords
psychosis suicide, psychosis symptoms, suicidal behavior, Adolescent Brain Development study, adolescents suicide

Article Source

FROM ARCHIVES OF GENERAL PSYCHIATRY

Vitals

Major Finding: Adolescents in the general population who reported suicidal behaviors were 10 times more likely than were those who did not report suicidal behavior to say that they had also experienced hallucinations or delusions.

Data Source: An analysis of data collected in two independent, cross-sectional epidemiologic studies of psychopathology in a representative sample of 423 adolescents aged 11-15 years in the general Irish population.

Disclosures: This study was funded by the European Community’s Seventh Framework Programme. No conflicts of interest were reported.

MI Rate Declines After Smoke-Free Laws Enacted in Olmsted County

More Evidence of Health Benefits
Article Type
Changed
Display Headline
MI Rate Declines After Smoke-Free Laws Enacted in Olmsted County

The rate of myocardial infarction dropped by one-third after laws prohibiting smoking in public places and workplaces were enacted in Olmsted County, Minnesota, according to a report published online Oct. 29 in Archives of Internal Medicine.

Although this epidemiologic study could not establish causality, no other interventions during the study period could plausibly explain this community-wide reduction in the MI rate. And the only major MI risk factor that declined concurrently was the prevalence of smoking; rates of hypertension and hypercholesterolemia remained steady, and rates of diabetes and obesity increased, said Dr. Richard D. Hurt of the Nicotine Dependence Center and the department of internal medicine at the Mayo Clinic in Rochester, Minn., and his associates.

Several studies have shown that smoke-free laws lead to a reduction in myocardial infarctions.

"We believe that secondhand smoke should be considered a major risk factor for MI, joining family history, hypertension, hyperlipidemia, diabetes mellitus, and low physical activity. Hence, all clinicians should ascertain secondhand smoke exposure and promote the elimination of secondhand smoke exposure as part of their lifestyle recommendations," they noted.

"All people should avoid secondhand smoke exposure as much as possible, and those with [coronary heart disease] should have no exposure to secondhand smoke," the investigators added.

Several studies have documented declines in hospital admissions for MI after the implementation of smoke-free laws, and the Institute of Medicine has concluded that there is a causal relationship between smoking bans and reductions in acute coronary events. To more closely examine the magnitude of that risk reduction, Dr. Hurt and his colleagues analyzed data from the Rochester Epidemiology Project, in which all cases of MI and sudden cardiac death in a well-defined community were validated using rigorous epidemiologic criteria. This project "has a long track record (more than 50 years) of robust epidemiologic studies," they said.

In Olmsted County, restaurants were required to be smoke free as of Jan. 1, 2002; bars and workplaces were required to follow suit on Oct. 1, 2007. The researchers examined rates of MI and sudden cardiac death during the 18 months before and the 18 months following implementation of each ordinance.

During the entire study period, there were 717 incident MIs and 514 cases of sudden cardiac death.

The age- and sex-adjusted rate of MI dropped from 150.8/100,000 people before the laws were implemented to 100.7/100,000 afterward – a 34% decline, the investigators said (Arch. Intern. Med. 2012 [doi:10.1001/2013.jamainternmed.46]).

Similarly, the incidence of sudden cardiac death declined by 17% during this period, a trend that did not reach statistical significance.

Smoke-free legislation is effective not only because it decreases the amount of secondhand smoke to which nonsmokers are exposed, but also because it reduces the intensity of smoking among smokers, increases quit rates, and reduces the rate of taking up smoking in the first place, Dr. Hurt and his associates said.

Other research has demonstrated that as little as 30 minutes of exposure to secondhand smoke causes an abrupt and dramatic decrease in coronary artery flow velocity reserve, as well as vascular injury that impairs endothelial function. Exposure also has been associated with low HDL cholesterol levels, increased markers of inflammation, increased serum levels of fibrinogen and homocysteine, decreased antioxidant levels, and increased insulin resistance, they wrote.

Taken together, these findings indicate that physicians should "become advocates for effective tobacco control policies, such as increased taxes, graphic labeling, smoke-free workplaces, and marketing and advertising restrictions," the researchers said.

One limitation of this study was that the population of Olmsted County is predominantly white. Further studies are needed "in communities of more diverse racial and ethnic composition," Dr. Hurt and his colleagues said.

This study was supported by ClearWay Minnesota; the National Heart, Lung, and Blood Institute; and the National Institute on Aging. No financial conflicts of interest were reported.

Body

The evidence documenting positive health outcomes from smoking bans continues to grow, as more areas adopt smoke-free legislation.

Clinicians should now work on closing the loopholes in existing smoke-free policies and expanding them to cover multiunit housing, motor vehicles, casinos, and outdoor locations. Studies have shown that smoking bans in multiunit housing not only reduce exposure to secondhand smoke, but also increase quit attempts among groups with a generally higher smoking prevalence, such as persons of low socioeconomic status.

Sara Kalkhoran, M.D., is in the department of internal medicine; Pamela M. Ling, M.D., is at the Center for Tobacco Control Research and Education within the department of internal medicine at the University of California, San Francisco. These remarks were taken from their invited commentary accompanying Dr. Hurt’s report (Arch. Intern. Med. 2012 [doi:10.1001/2013.jamainternmed.269]). They reported no financial conflicts of interest.

Legacy Keywords
myocardial infarction rate, smoke-free laws, smoke heart problems, smoke MI, MI rate, Olmsted County smoke


Title
More Evidence of Health Benefits



Display Headline
MI Rate Declines After Smoke-Free Laws Enacted in Olmsted County
Article Source

FROM ARCHIVES OF INTERNAL MEDICINE

Vitals

Major Finding: The rate of incident MI dropped 34%, from 150.8/100,000 people to 100.7/100,000, after laws prohibiting smoking in public places and workplaces were enacted.

Data Source: An analysis of data in the Rochester Epidemiology Project concerning the incidence of MI and sudden cardiac death in Olmsted County, Minn., during the 18 months before and the 18 months after smoke-free legislation was enacted.

Disclosures: This study was supported by ClearWay Minnesota; the National Heart, Lung, and Blood Institute; and the National Institute on Aging. No financial conflicts of interest were reported.

Crohn's Responded to Ustekinumab

Article Type
Changed
Display Headline
Crohn's Responded to Ustekinumab

Ustekinumab induced a clinical response in patients with moderate to severe Crohn’s disease that was resistant to tumor necrosis factor antagonists, in a phase IIb clinical trial published online Oct. 17 in the New England Journal of Medicine.

However, the agent did not improve remission rates, compared with placebo, said Dr. William J. Sandborn, AGAF, who is professor of medicine and chief of the division of gastroenterology at the University of California San Diego, La Jolla, and his associates.

Dr. William J. Sandborn

"A sizable proportion" of patients with moderate to severe Crohn’s disease do not respond to TNF antagonists, have an unsustained response, or must discontinue the medications because of adverse effects. After ustekinumab showed efficacy in such patients in a phase IIa clinical study, Dr. Sandborn and his colleagues performed a 36-week double-blind phase IIb trial in 526 adults at 153 medical centers in 12 countries.

Ustekinumab, a human IgG monoclonal antibody that inhibits the receptors for interleukin-12 and interleukin-23 on T cells, natural killer cells, and antigen-presenting cells, has Food and Drug Administration approval for use in plaque psoriasis. This clinical trial was sponsored by an affiliate of the manufacturer, Janssen Biotech.

During an 8-week induction phase, the study subjects were randomly assigned to receive intravenous placebo (132 patients) or ustekinumab in 1-mg/kg (131 patients), 3-mg/kg (132 patients), or 6-mg/kg (131 patients) doses. Then, during weeks 8-36, the study subjects who showed a response to induction therapy and those who did not show a response were separately randomized to receive either subcutaneous ustekinumab (90 mg) or placebo at week 8 and week 16, as maintenance therapy.

Treatment efficacy was assessed at week 22, and patients were followed through week 36 for a safety analysis. A total of 36.1% of the subjects discontinued the study before week 36.

The primary end point was a clinical response, defined as a decrease of 100 points or more on the Crohn’s Disease Activity Index (CDAI) score.

A total of 39.7% of patients receiving the 6-mg/kg induction dose showed a clinical response, which was significantly greater than the 23.5% of patients receiving placebo, the investigators said (N. Engl. J. Med. 2012 [doi:10.1056/NEJMoa1203572]).

More patients receiving the lower doses of ustekinumab than receiving placebo showed a clinical response, but the differences between these low-dose groups and the placebo group did not reach statistical significance.

The 6-mg/kg dose was effective across most demographic and disease characteristics, judging from the findings of a subgroup analysis. It was consistently effective in patients who had failed on their first attempt at therapy with TNF antagonists, patients who had failed on two or more TNF antagonists, and patients who had only a transient response to TNF antagonists.

However, rates of clinical remission did not differ significantly between patients receiving ustekinumab and those receiving placebo, Dr. Sandborn and his associates said.

At all follow-up visits, the proportion of patients who had a 70-point clinical response was significantly higher, the reductions in mean CDAI scores were significantly greater, and the reductions in C-reactive protein levels were significantly greater in patients receiving 6 mg per kg of ustekinumab than in the placebo group.

As a maintenance therapy, 90 mg of subcutaneous ustekinumab appeared to be effective in patients who responded to the induction dose of the agent. The proportion of patients who showed a clinical response at week 22 was 69.4% in those receiving maintenance ustekinumab, significantly greater than the 42.5% response rate among those receiving maintenance placebo.

Among patients who responded to induction-phase ustekinumab, 41.7% of those who also received maintenance ustekinumab achieved clinical remission at week 22, compared with only 27.4% of those who received maintenance placebo.

Similarly, among patients who showed a response to induction ustekinumab, reductions in both CDAI scores and CRP levels were sustained if they continued on maintenance ustekinumab but were not sustained if they continued on placebo for the maintenance period.

However, patients who did not show a response to induction ustekinumab also did not benefit from additional ustekinumab in the maintenance phase of the study.

The results of the safety analysis were "somewhat limited" by the small sample size and the short duration of treatment. No deaths, serious opportunistic infections, or major adverse cardiovascular events were reported, "but large studies of longer duration are needed to assess uncommon adverse events," the investigators said.

Of note, one patient receiving ustekinumab as both induction and maintenance therapy developed a basal cell carcinoma. Among patients taking ustekinumab in the induction phase of the study, six developed serious infections: Clostridium difficile, viral gastroenteritis, UTI, anal abscess, vaginal abscess, and a staph infection of a central catheter.

Legacy Keywords
ustekinumab Crohn's disease, tumor necrosis factor antagonists




Vitals

Major Finding: Of patients with moderate to severe Crohn's disease who received ustekinumab (6 mg/kg), 39.7% showed a decrease of 100 points or more in CDAI score, compared with 23.5% of those who received placebo.

Data Source: The data come from a 36-week international phase IIb randomized clinical trial comparing three doses of ustekinumab with placebo in 526 adults who had refractory Crohn’s disease.

Disclosures: This study was sponsored by Janssen Research and Development; Janssen Biotech makes ustekinumab. Dr. Sandborn and his associates reported numerous ties to industry sources.

Best Lynch Screening Strategy Identified

Casting a Wide Net May Work Best
Article Type
Changed
Display Headline
Best Lynch Screening Strategy Identified

Universal testing of colorectal cancers for DNA mismatch repair genes is the strategy most likely to identify the approximately 1%-3% of patients who have Lynch syndrome, according to a report in the Oct. 17 issue of JAMA.

Correctly identifying the small subgroup of colorectal cancer (CRC) patients who have Lynch syndrome is crucial so that their presymptomatic relatives can be found and all the affected family members can consider preventive measures to limit their morbidity and mortality.

Dr. Leticia Moreira and her colleagues in the EPICOLON consortium, an international network of colon cancer cohorts, pooled data involving 10,206 probands to show that "universal tumor MMR testing followed by germline testing offers the highest sensitivity," compared with other strategies, "although the increase in the diagnostic yield is modest" (JAMA 2012;308:1555-65).

This research highlights the limitations of various methods for identifying the subgroup of colorectal cancer patients who have Lynch syndrome and should remind clinicians "that simply asking about a family history of CRC in a first-degree relative will miss the majority of patients with Lynch syndrome," because only 43% of patients with Lynch syndrome have such a family history, wrote Dr. Uri Ladabaum and Dr. James M. Ford of the gastrointestinal cancer prevention program and clinical cancer genetics program at Stanford (Calif.) University in an editorial accompanying the study (JAMA 2012;308:1581-3).

"The majority of CRC patients do not have Lynch syndrome. But in the haystack of patients with CRC, those with Lynch syndrome are more like large knitting needles than tiny sewing needles – and a systematic search can find them," they wrote.

Lynch syndrome, also known as hereditary nonpolyposis colorectal cancer, is the most common form of familial colorectal cancer and is caused by germline mutations in the DNA mismatch repair (MMR) genes MSH2, MLH1, MSH6, and PMS2. Dysfunction of these genes causes an accumulation of errors during DNA replication, particularly in the repetitive sequences called microsatellites.

"As a result, tumors of patients with Lynch syndrome characteristically demonstrate MMR deficiency, defined as the presence of microsatellite instability or loss of the MMR protein expression, which is the hallmark of this disorder," the investigators wrote.

Several sets of guidelines have been issued for identifying which colorectal cancer patients should undergo tumor DNA testing to reveal these traits, but none have proved sensitive and specific enough to do an optimal job, and all have been difficult to apply in clinical practice, said Dr. Moreira of the University of Barcelona and her associates.

"Unless there is strong clinical suspicion, the majority of cases remain undetected, leading to the lack of implementation of highly effective preventive measures" including intensive screening by colonoscopy and prophylactic removal of targeted organs, they noted.

By pooling the data from four large cohorts of colorectal cancer patients around the world, the researchers compared the effectiveness of different screening strategies for identifying Lynch syndrome. Universal testing of the tumors for DNA MMR abnormalities was the most effective method, with a sensitivity of 100% and a specificity of 93%.

In comparison, use of the revised Bethesda guidelines will fail to detect approximately 12% of cases, use of the Jerusalem recommendations will fail to detect approximately 15%, and use of a "selective strategy" will fail to detect approximately 5%, the investigators reported. The specificities of the strategies ranged from 95.5% with the "selective strategy" to 97.5% with the revised Bethesda guidelines. The investigators developed the "selective strategy" for screening by using the most sensitive variables in a bivariate analysis: CRC diagnosis at age 70 years or younger, or fulfillment of at least one criterion of the revised Bethesda guidelines in older patients.

The diagnostic yield of universal MMR testing followed by germline testing was 2.2%. In comparison, the diagnostic yield of the revised Bethesda guidelines was 2.0%, that of the Jerusalem recommendations was 1.9%, and that of "selective criteria" was 2.1%.

Body

Risk stratification for colon cancer is of great interest to gastroenterologists because it determines the need for colonoscopy (alternatives are endorsed only for average-risk individuals) as well as the age of first screening and the screening intervals. Lynch syndrome patients are among the highest-risk individuals, and readily identifying them has been challenging because presentation may involve non-GI cancers and because a wide range of screening strategies have been used.


Dr. Barbara H. Jung

Lynch syndrome is characterized by germline mutations that affect mismatch repair (MMR) protein expression, leading to loss of mismatch repair. Colorectal cancers (CRCs) from classic Lynch syndrome patients will harbor loss of a specific MMR protein, as detected by protein staining of the tumor sample, and MMR deficiency, as seen in marker regions of the tumor cells’ DNA. Sequencing of DNA from peripheral blood cells will reveal true Lynch syndrome, as these are the patients with germline mutations who may pass on their genetic defect and higher risk to their offspring, in contrast to mismatch repair–deficient tumors that arise from silencing of an MMR gene in the tumor alone.

The study by Dr. Moreira and associates now compares the sensitivity of diagnosing Lynch syndrome in patients with CRC in three large cohorts from Europe and the United States. Not surprisingly, testing tumors from all patients with CRC for expression of MMR proteins picked up the most patients with Lynch syndrome. Other strategies came close to universal testing, such as the modified Bethesda criteria and a "selective" strategy of testing all patients with CRC at age 70 years or younger plus older patients fulfilling at least one criterion of the modified Bethesda criteria.

This study also revealed additional important points: a family history of CRC is not sufficient to detect Lynch syndrome, Lynch syndrome may occur in older patients, and prevalence may vary among populations. In practice, it may be best to cast a wide net and implement universal testing for MMR protein expression in all patients with CRC, with positive results then worked up further, rather than applying complicated selection criteria that are difficult to employ. Having MMR and genetic status available will be particularly helpful when we see the typical patient in a GI office who is a relative of a patient with CRC.

Barbara H. Jung is an associate professor of medicine in the division of gastroenterology at Northwestern Feinberg School of Medicine, Chicago.

Legacy Keywords
colorectal cancer testing, DNA mismatch repair genes

Vitals

Major Finding: Universal screening of colorectal cancers’ DNA for MMR abnormalities had a sensitivity of 100% and a specificity of 93% in identifying Lynch syndrome.

Data Source: A pooled data analysis of screening strategies in 10,206 probands with colorectal cancer, of whom 1,386 (14%) proved to have the tumor MMR deficiency characteristic of Lynch syndrome.

Disclosures: This study was supported by the Ministerio de Economía y Competitividad, the Agència de Gestió d'Ajuts Universitaris i de Recerca, the Asociación Española Contra el Cáncer, the Hospital Clínic of Barcelona, and the Instituto de Salud Carlos III. No financial conflicts of interest were reported. Dr. Ladabaum reported ties to Given Imaging, GE Healthcare, Abbott Molecular, Quest Diagnostics, RA Capital, Roche, Vaxart, Endosphere, and Epigenomics. Dr. Ford reported having consulted for Bristol-Myers Squibb.

Ahead of the Journals: Discontinuing Risperidone in AD

Article Type
Changed
Display Headline
Ahead of the Journals: Discontinuing Risperidone in AD

Among patients with Alzheimer’s disease who develop psychosis or agitation-aggression that responds to risperidone, discontinuing the drug as advised after 3-6 months is associated with a doubling of the rate of relapse, according to a study published Oct. 17 in the New England Journal of Medicine.

"Federal regulations for nursing homes strongly urge discontinuation of antipsychotic drugs after 3-6 months of treatment" because of concern about adverse effects, even though "evidence from controlled trials in support of this long-standing regulation is very limited," said Dr. D.P. Devanand of the New York State Psychiatric Institute and Columbia University, New York, and his associates.

Dr. D. P. Devanand

"Our findings suggest that patients with psychosis or agitation-aggression who have a sustained response to antipsychotic treatment for 4-8 months have a significantly increased risk of relapse for at least 4 months after discontinuation, and this finding should be weighed against the risk of adverse effects with continued antipsychotic treatment," they noted (N. Engl. J. Med. 2012 [doi:10.1056/NEJMoa1114058]).

Dr. Devanand and his colleagues presented their findings in July at the Alzheimer’s Association International Conference.

This study was funded by the National Institutes of Health and the U.S. Department of Veterans Affairs. Dr. Devanand and his associates reported numerous ties to industry sources.

Legacy Keywords
patients with Alzheimer's disease, AD psychosis, agitation-aggression, risperidone Alzheimer's
Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Vitals

Major Finding: The rate of relapse of psychosis or agitation-aggression was nearly twice as high in AD patients who discontinued risperidone (60%) as in those who continued the drug (33%).

Data Source: Results were taken from a randomized clinical trial in which 180 AD patients with psychosis or agitation-aggression received flexible-dose risperidone for 16 weeks, and those who responded were randomly assigned to either continue or discontinue the drug while being followed for another 32 weeks.

Disclosures: This study was funded by the National Institutes of Health and the U.S. Department of Veterans Affairs. Dr. Devanand and his associates reported numerous ties to industry sources.

Anti-TNF Resistant Crohn's Disease May Respond to Ustekinumab

Article Type
Changed
Display Headline
Anti-TNF Resistant Crohn's Disease May Respond to Ustekinumab

Ustekinumab induced a clinical response in patients with moderate to severe Crohn’s disease that was resistant to tumor necrosis factor antagonists, in a phase IIb clinical trial published online Oct. 17 in the New England Journal of Medicine.

However, the agent did not improve remission rates, compared with placebo, said Dr. William J. Sandborn, professor of medicine and chief of the division of gastroenterology at the University of California San Diego, La Jolla, and his associates.

Dr. William J. Sandborn

"A sizable proportion" of patients with moderate to severe Crohn’s disease do not respond to TNF antagonists, have an unsustained response, or must discontinue the medications because of adverse effects. After ustekinumab showed efficacy in such patients in a phase IIa clinical study, Dr. Sandborn and his colleagues performed a 36-week double-blind phase II2b trial in 526 adults at 153 medical centers in 12 countries.

Ustekinumab, a human IgG monoclonal antibody that blocks interleukin-12 and interleukin-23 from engaging their receptors on T cells, natural killer cells, and antigen-presenting cells, has Food and Drug Administration approval for use in plaque psoriasis. This clinical trial was sponsored by an affiliate of the manufacturer, Janssen Biotech.

During an 8-week induction phase, the study subjects were randomly assigned to receive intravenous placebo (132 patients) or ustekinumab in 1-mg/kg (131 patients), 3-mg/kg (132 patients), or 6-mg/kg (131 patients) doses. Then, during weeks 8-36, the study subjects who showed a response to induction therapy and those who did not show a response were separately randomized to receive either subcutaneous ustekinumab (90 mg) or placebo at week 8 and week 16, as maintenance therapy.

Treatment efficacy was assessed at week 22, and patients were followed through week 36 for a safety analysis. A total of 36.1% of the subjects discontinued the study before week 36.

The primary end point was a clinical response, defined as a decrease of 100 points or more on the Crohn’s Disease Activity Index (CDAI) score.

A total of 39.7% of patients receiving the 6-mg/kg induction dose showed a clinical response, which was significantly greater than the 23.5% of patients receiving placebo, the investigators said (N. Engl. J. Med. 2012 [doi:10.1056/NEJMoa1203572]).

More patients showed a clinical response with the lower doses of ustekinumab than with placebo, but the differences between these low-dose groups and the placebo group did not reach statistical significance.

The 6-mg/kg dose was effective across most demographic and disease characteristics, judging from the findings of a subgroup analysis. It was consistently effective in patients who had failed on their first attempt at therapy with TNF antagonists, patients who had failed on two or more TNF antagonists, and patients who had only had a transient response to TNF antagonists.

However, rates of clinical remission did not differ significantly between patients receiving ustekinumab and those receiving placebo, Dr. Sandborn and his associates said.

At all follow-up visits, the proportion of patients who had a 70-point clinical response was significantly higher, the reductions in mean CDAI scores were significantly greater, and the reductions in C-reactive protein levels were significantly greater in patients receiving 6 mg/kg of ustekinumab than in the placebo group.

As a maintenance therapy, 90 mg of subcutaneous ustekinumab appeared to be effective in patients who responded to the induction dose of the agent. The proportion of patients who showed a clinical response at week 22 was 69.4% in those receiving maintenance ustekinumab, significantly greater than the 42.5% response rate among those receiving maintenance placebo.

Among patients who responded to induction-phase ustekinumab, 41.7% of those who also received maintenance ustekinumab achieved clinical remission at week 22, compared with only 27.4% of those who received maintenance placebo.

Similarly, among patients who showed a response to induction ustekinumab, reductions in both CDAI scores and CRP levels were sustained if they continued on maintenance ustekinumab but were not sustained if they continued on placebo for the maintenance period.

However, patients who did not show a response to induction ustekinumab also did not benefit from additional ustekinumab in the maintenance phase of the study.

The results of the safety analysis were "somewhat limited" by the small sample size and the short duration of treatment. No deaths, serious opportunistic infections, or major adverse cardiovascular events were reported, "but large studies of longer duration are needed to assess uncommon adverse events," the investigators said.

Of note, one patient receiving ustekinumab as both induction and maintenance therapy developed a basal cell carcinoma. Among patients taking ustekinumab in the induction phase of the study, six developed serious infections: Clostridium difficile, viral gastroenteritis, UTI, anal abscess, vaginal abscess, and a staph infection of a central catheter.

 

 

This study was sponsored by Janssen Research and Development; Janssen Biotech makes ustekinumab. Dr. Sandborn and his associates reported numerous ties to industry sources.

Author and Disclosure Information

Publications
Topics
Legacy Keywords
ustekinumab Crohn's disease, Crohn's disease treatment, tumor necrosis factor antagonists, Dr. William J. Sandborn
Author and Disclosure Information

Author and Disclosure Information

Ustekinumab induced a clinical response in patients with moderate to severe Crohn’s disease that was resistant to tumor necrosis factor antagonists, in a phase IIb clinical trial published online Oct. 17 in the New England Journal of Medicine.

However, the agent did not improve remission rates, compared with placebo, said Dr. William J. Sandborn, professor of medicine and chief of the division of gastroenterology at the University of California San Diego, La Jolla, and his associates.

"A sizable proportion" of patients with moderate to severe Crohn’s disease do not respond to TNF antagonists, have an unsustained response, or must discontinue the medications because of adverse effects. After ustekinumab showed efficacy in such patients in a phase IIa clinical study, Dr. Sandborn and his colleagues performed a 36-week, double-blind phase IIb trial in 526 adults at 153 medical centers in 12 countries.

Ustekinumab, a human IgG monoclonal antibody that inhibits the receptors for interleukin-12 and interleukin-23 on T cells, natural killer cells, and antigen-presenting cells, has Food and Drug Administration approval for use in plaque psoriasis. This clinical trial was sponsored by an affiliate of the manufacturer, Janssen Biotech.

During an 8-week induction phase, the study subjects were randomly assigned to receive intravenous placebo (132 patients) or ustekinumab in 1-mg/kg (131 patients), 3-mg/kg (132 patients), or 6-mg/kg (131 patients) doses. Then, during weeks 8-36, the study subjects who showed a response to induction therapy and those who did not show a response were separately randomized to receive either subcutaneous ustekinumab (90 mg) or placebo at week 8 and week 16, as maintenance therapy.

Treatment efficacy was assessed at week 22, and patients were followed through week 36 for a safety analysis. A total of 36.1% of the subjects discontinued the study before week 36.

The primary end point was a clinical response, defined as a decrease of 100 points or more on the Crohn’s Disease Activity Index (CDAI) score.

A total of 39.7% of patients receiving the 6-mg/kg induction dose showed a clinical response, which was significantly greater than the 23.5% of patients receiving placebo, the investigators said (N. Engl. J. Med. 2012 [doi:10.1056/NEJMoa1203572]).

Clinical response rates also were higher with the lower doses of ustekinumab than with placebo, but the differences between these low-dose groups and the placebo group did not reach statistical significance.

The 6-mg/kg dose was effective across most demographic and disease characteristics, judging from the findings of a subgroup analysis. It was consistently effective in patients who had failed on their first attempt at therapy with TNF antagonists, patients who had failed on two or more TNF antagonists, and patients who had only had a transient response to TNF antagonists.

However, rates of clinical remission did not differ significantly between patients receiving ustekinumab and those receiving placebo, Dr. Sandborn and his associates said.

At all follow-up visits, the proportion of patients who had a 70-point clinical response was significantly higher, the reductions in mean CDAI scores were significantly greater, and the reductions in C-reactive protein levels were significantly greater in patients receiving 6 mg/kg of ustekinumab than in the placebo group.

As a maintenance therapy, 90 mg of subcutaneous ustekinumab appeared to be effective in patients who responded to the induction dose of the agent. The proportion of patients who showed a clinical response at week 22 was 69.4% in those receiving maintenance ustekinumab, significantly greater than the 42.5% response rate among those receiving maintenance placebo.

Among patients who responded to induction-phase ustekinumab, 41.7% of those who also received maintenance ustekinumab achieved clinical remission at week 22, compared with only 27.4% of those who received maintenance placebo.

Similarly, among patients who showed a response to induction ustekinumab, reductions in both CDAI scores and CRP levels were sustained if they continued on maintenance ustekinumab but were not sustained if they continued on placebo for the maintenance period.

However, patients who did not show a response to induction ustekinumab also did not benefit from additional ustekinumab in the maintenance phase of the study.

The results of the safety analysis were "somewhat limited" by the small sample size and the short duration of treatment. No deaths, serious opportunistic infections, or major adverse cardiovascular events were reported, "but large studies of longer duration are needed to assess uncommon adverse events," the investigators said.

Of note, one patient receiving ustekinumab as both induction and maintenance therapy developed a basal cell carcinoma. Among patients taking ustekinumab in the induction phase of the study, six developed serious infections: Clostridium difficile, viral gastroenteritis, UTI, anal abscess, vaginal abscess, and a staph infection of a central catheter.
This study was sponsored by Janssen Research and Development; Janssen Biotech makes ustekinumab. Dr. Sandborn and his associates reported numerous ties to industry sources.

Article Type
Display Headline
Anti-TNF Resistant Crohn's Disease May Respond to Ustekinumab
Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE


Vitals

Major Finding: Of patients with moderate to severe Crohn's disease who received ustekinumab (6 mg/kg), 39.7% showed a decrease of 100 points or more in CDAI score, compared with 23.5% of those who received placebo.

Data Source: The data come from a 36-week, international phase IIb randomized clinical trial comparing three doses of ustekinumab with placebo in 526 adults who had refractory Crohn's disease.

Disclosures: This study was sponsored by Janssen Research and Development; Janssen Biotech makes ustekinumab. Dr. Sandborn and his associates reported numerous ties to industry sources.

Mean Serum Lipids Have Improved Since Late 1980s

Article Type
Changed
Display Headline
Mean Serum Lipids Have Improved Since Late 1980s

Mean serum levels of lipids have improved among American adults since the late 1980s, according to an analysis of data from three nationwide surveys, published Oct. 17 in JAMA.

Between 1988 and 2010, mean total cholesterol, non-HDL cholesterol, and LDL cholesterol levels declined and mean HDL cholesterol levels rose in adults overall as well as across most racial/ethnic and gender categories, said Margaret D. Carroll of the Division of Health and Nutrition Examination Surveys, National Center for Health Statistics, Hyattsville, Md., and her associates.

The investigators examined data from the National Health and Nutrition Examination Surveys (NHANES) for 1988-1994, 1999-2002, and 2007-2010 to track temporal trends in lipid levels. Each cross-sectional survey included health-related interviews and physical examinations of a nationally representative sample of tens of thousands of adults.

In the adult population as a whole, the mean total cholesterol level declined in a linear fashion from 206 mg/dL in 1988-1994 to 203 mg/dL in 1999-2002 and to 196 mg/dL in 2007-2010. This pattern remained the same in separate analyses of men and women and in all racial/ethnic subgroups, except for Mexican American men (JAMA 2012;308:1545-54).

LDL cholesterol also decreased in a linear fashion in all adults, from 128 mg/dL in the first survey to 124 mg/dL in the second and to 119 mg/dL in the final survey. LDL cholesterol levels declined in both men and women and eventually converged, so there was no longer a difference between the sexes in the latest survey.

HDL cholesterol rose in a linear fashion from 50.7 mg/dL to 52.5 mg/dL among all adults, and also rose in separate analyses of both sexes. When the data were broken down by racial/ethnic categories, HDL cholesterol rose in whites of both sexes but not in blacks or Mexican Americans.

Triglycerides showed a slightly different pattern, rising between the late 1980s (118 mg/dL) and the early 2000s (123 mg/dL), but then falling again by 2010 (110 mg/dL).

All of these temporal trends persisted when the analysis was restricted to only the oldest adults, aged 50 and older.

During the study period, the proportion of adults taking lipid-lowering medications also rose, from 3.4% in 1988-1994 to 9.3% in 1999-2002 and to 15.5% in 2007-2010. However, these medications did not explain the entire improvement in lipid profiles, which also improved markedly in adults who weren’t taking them.

"The Healthy People 2010 guideline of an age-adjusted mean total cholesterol level of 200 mg/dL or less has been achieved in [all] adults, in men, in women, and in all race/ethnicity and sex subgroups. However, the age-adjusted mean LDL cholesterol level in adults of 116 mg/dL [remains] higher than the optimal range of below 100 mg/dL, [which is] associated with a lower risk of CHD [coronary heart disease]," Ms. Carroll and her colleagues said.

The researchers speculated that the favorable trends in lipid profiles might be attributable in part to a decrease in the consumption of trans-fatty acids and to other healthy lifestyle changes. The changes are unlikely to have resulted from increases in physical activity or decreases in the intake of saturated fat, as other studies have demonstrated that activity has not increased and saturated fat intake has not declined during the study period, they said.

This laboratory analysis of lipids was funded by the National Heart, Lung, and Blood Institute. NHANES is conducted by the National Center for Health Statistics and the Centers for Disease Control and Prevention. No conflicts of interest were reported.

Article Source

FROM JAMA


Vitals

Major Finding: Total cholesterol, LDL cholesterol, and non-HDL cholesterol all decreased and HDL cholesterol increased during the study period of 22 years.

Data Source: This was an analysis of temporal trends in lipid profiles among 16,573 adults who participated in the National Health and Nutrition Examination Surveys (NHANES) in 1988-1994, 9,471 who participated in 1999-2002, and 11,766 who participated in 2007-2010.

Disclosures: This laboratory analysis of lipids was funded by the National Heart, Lung, and Blood Institute. NHANES is conducted by the National Center for Health Statistics and the Centers for Disease Control and Prevention. No conflicts of interest were reported.

Screening All Colorectal Tumors Detects Lynch Syndrome Best

Article Type
Changed
Display Headline
Screening All Colorectal Tumors Detects Lynch Syndrome Best

Universal testing of colorectal cancers for DNA mismatch repair genes is the strategy most likely to identify the approximately 1%-3% of patients who have Lynch syndrome, according to a report in the Oct. 17 issue of JAMA.

Correctly identifying the small subgroup of colorectal cancer (CRC) patients who have Lynch syndrome is crucial so that their presymptomatic relatives can be found and all the affected family members can consider preventive measures to limit their morbidity and mortality.

Dr. Leticia Moreira and her colleagues in the EPICOLON consortium, an international network of colon cancer cohorts, pooled data involving 10,206 probands to show that "universal tumor MMR testing followed by germline testing offers the highest sensitivity," compared with other strategies, "although the increase in the diagnostic yield is modest" (JAMA 2012;308:1555-65).

This research highlights the limitations of various methods for identifying the subgroup of colorectal cancer patients who have Lynch syndrome and should remind clinicians "that simply asking about a family history of CRC in a first-degree relative will miss the majority of patients with Lynch syndrome," because only 43% of patients with Lynch syndrome have such a family history, wrote Dr. Uri Ladabaum and Dr. James M. Ford of the gastrointestinal cancer prevention program and clinical cancer genetics program at Stanford (Calif.) University in an editorial accompanying the study (JAMA 2012;308:1581-3).

"The majority of CRC patients do not have Lynch syndrome. But in the haystack of patients with CRC, those with Lynch syndrome are more like large knitting needles than tiny sewing needles--and a systematic search can find them," they wrote.

Lynch syndrome, also known as hereditary nonpolyposis colorectal cancer, is the most common form of familial colorectal cancer and is caused by germline mutations in the DNA mismatch repair (MMR) genes MSH2, MLH1, MSH6, and PMS2. Dysfunction of these genes causes an accumulation of errors during DNA replication, particularly in the repetitive sequences called microsatellites.

"As a result, tumors of patients with Lynch syndrome characteristically demonstrate MMR deficiency, defined as the presence of microsatellite instability or loss of the MMR protein expression, which is the hallmark of this disorder," the investigators wrote.

Several sets of guidelines have been issued for identifying which colorectal cancer patients should undergo tumor DNA testing to reveal these traits, but none have proved sensitive and specific enough to do an optimal job, and all have been difficult to apply in clinical practice, said Dr. Moreira of the University of Barcelona and her associates.

"Unless there is strong clinical suspicion, the majority of cases remain undetected, leading to the lack of implementation of highly effective preventive measures" including intensive screening by colonoscopy and prophylactic removal of targeted organs, they noted.

By pooling the data from four large cohorts of colorectal cancer patients around the world, the researchers compared the effectiveness of different screening strategies for identifying Lynch syndrome. Universal testing of the tumors for DNA MMR abnormalities was the most effective method, with a sensitivity of 100% and a specificity of 93%.

In comparison, use of the revised Bethesda guidelines will fail to detect approximately 12% of cases, use of the Jerusalem recommendations will fail to detect approximately 15%, and use of a "selective strategy" will fail to detect approximately 5%, the investigators reported. The specificities of the strategies ranged from 95.5% with the "selective strategy" to 97.5% with the revised Bethesda guidelines. The investigators developed the "selective strategy" for screening by using the significant predictors of Lynch syndrome identified in a multivariate analysis: colorectal cancer diagnosed at 60 years or younger, one or more first-degree relatives with CRC diagnosed at 50 years or younger, or personal history of metachronous Lynch syndrome–related tumors diagnosed at 50 years or younger.

The diagnostic yield of universal MMR testing followed by germline testing was 2.2%. In comparison, the diagnostic yield of the revised Bethesda guidelines was 2.0%, that of the Jerusalem recommendations was 1.9%, and that of "selective criteria" was 2.1%.

This study was supported by the Ministerio de Economía y Competitividad; the Agència de Gestió d’Ajuts Universitaris i de Recerca; the Asociación Española contra el Cáncer; the Hospital Clinic of Barcelona; and the Instituto de Salud Carlos III. No financial conflicts of interest were reported. Dr. Ladabaum reported ties to Given Imaging, GE Healthcare, Abbott Molecular, Quest Diagnostics, RA Capital, Roche, Vaxart, Endosphere, and Epigenomics. Dr. Ford reported having consulted for Bristol-Myers Squibb.


The diagnostic yield of universal MMR testing followed by germline testing was 2.2%. In comparison, the diagnostic yield of the revised Bethesda guidelines was 2.0%, that of the Jerusalem recommendations was 1.9%, and that of "selective criteria" was 2.1%.

This study was supported by the Ministerio de Economía y Competitividad; the Agència de Gestió d'Ajuts Universitaris i de Recerca; the Asociación Española contra el Cáncer; the Hospital Clinic of Barcelona; and the Instituto de Salud Carlos III. No financial conflicts of interest were reported. Dr. Ladabaum reported ties to Given Imaging, GE Healthcare, Abbott Molecular, Quest Diagnostics, RA Capital, Roche, Vaxart, Endosphere, and Epigenomics. Dr. Ford reported having consulted for Bristol-Myers Squibb.

Display Headline
Screening All Colorectal Tumors Detects Lynch Syndrome Best
Legacy Keywords
Lynch syndrome detection, Lynch syndrome tumors, Lynch syndrome cancer, colorectal cancer testing, DNA mismatch repair genes
Article Source

FROM JAMA

Vitals

Major Finding: Universal screening of colorectal cancers’ DNA for MMR abnormalities had a sensitivity of 100% and a specificity of 93% in identifying Lynch syndrome.

Data Source: A pooled data analysis of screening strategies in 10,206 probands with colorectal cancer, of whom 1,386 (14%) proved to have the tumor MMR deficiency characteristic of Lynch syndrome.

Disclosures: This study was supported by the Ministerio de Economía y Competitividad, the Agència de Gestió d'Ajuts Universitaris i de Recerca, the Asociación Española contra el Cáncer, the Hospital Clinic of Barcelona, and the Instituto de Salud Carlos III. No financial conflicts of interest were reported. Dr. Ladabaum reported ties to Given Imaging, GE Healthcare, Abbott Molecular, Quest Diagnostics, RA Capital, Roche, Vaxart, Endosphere, and Epigenomics. Dr. Ford reported having consulted for Bristol-Myers Squibb.