Analysis could help to predict West Nile epidemics
Analyzing three local factors helps to predict when an epidemic of West Nile disease is imminent or has just begun, according to a report in the July 17 issue of JAMA.
This in turn should allow augmented mosquito control efforts and other measures such as public education to limit human illness from the virus just as an epidemic commences, said Dr. Wendy M. Chung of Dallas County Health and Human Services and her associates.
By studying these three factors – local weather patterns, the geographical distribution of recent outbreaks, and the current mosquito vector index (an estimate of the average number of virus-infected mosquitoes collected in surveillance traps each night) – public health specialists may be able to identify rapid rises in West Nile activity before it is too late to intervene. Until now, the best that such experts could do was to "wait to initiate augmented vector control until significant numbers of human cases and deaths [had been] reported."
This wait has allowed epidemics to take root, because the incubation period between mosquito bite and symptom onset is 1-2 weeks, and a further 1- to 2-week delay follows while viral cultures are completed and positive results are reported to authorities, the investigators said.
Dr. Chung and her colleagues came to these conclusions after analyzing data from the 2012 resurgence of West Nile virus, which occurred nationwide but was particularly damaging in the Dallas County area. "Dallas has been a known focus of mosquito-borne encephalitis since 1966," they noted, and had an ongoing surveillance program that included collecting mosquitoes in traps.
Before the 2012 epidemic there, Dallas had enjoyed 5 years of relative quietude in West Nile virus activity.
Then, between May and early December 2012, 1,162 cases of West Nile viremia were reported in Dallas County; 398 cases of viral illness were confirmed, including 19 fatal cases. This "record-setting outbreak" began a month earlier than previous West Nile seasons had, and the number of new cases also escalated more rapidly than in previous seasons.
An additional 17 cases of West Nile viremia were identified among blood donors, which was more than twice the number of viremic blood donors identified during the previous outbreak. The overall incidence of West Nile disease in Dallas County was 7.30 per 100,000 residents in 2012, more than twice as high as the previous peak incidence of 2.91 per 100,000.
The demographic and clinical characteristics of cases in the 2012 outbreak were similar to those in previous outbreaks. In 2012, 96% required hospitalization, 35% intensive care, and 18% assisted ventilation. The case fatality rate was 10%.
An analysis of local weather patterns revealed that the winter preceding the 2012 epidemic was the mildest in a decade, with no hard freezes, a record number of days with above-normal temperatures, and record winter rainfall. Summer temperatures also were warmer than average that year, and there was less wind than usual during the typically windy months.
These findings suggest that West Nile activity increases when extreme weather conditions favor mosquito survival over the winter, a longer period of mosquito-to-bird transmission in early spring, and an early start to human infections, Dr. Chung and her associates said (JAMA 2013;310:297-307).
An analysis of the geographic distribution of cases over time revealed a "repeated predilection of cases" in the northern region of the county, with a particular hot spot in a defined north-central location. "The census tracts in this high-risk hot spot were distinguished from those in other areas by higher property values, greater housing density, and a higher percentage of houses unoccupied (reflecting the current economic downturn)," the researchers wrote.
These findings concur with those of previous studies in metropolitan areas, which found that such neighborhoods have more neglected swimming pools, increasing mosquito populations, and less forestation, allowing greater viral amplification among birds. "Whatever the biological explanation, identifying a perennial geographical pattern of human infections should be useful in targeting such areas for more intensive public health prevention measures, including preseason source reduction, larviciding, and education," Dr. Chung and her associates wrote.
Finally, mosquito surveillance and screening for West Nile virus carriage allowed the calculation of a species-specific vector index: an estimate of the average number of infected mosquitoes caught per trap-night. The weekly vector index in 2012 first detected West Nile in May, a full month earlier than in previous seasons.
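The index itself is simple arithmetic: for each vector species, the average number of mosquitoes collected per trap-night multiplied by the estimated proportion of those mosquitoes carrying the virus. The Python sketch below illustrates the calculation with hypothetical figures; it is not the study's code, and in practice the infection rate is estimated from pooled mosquito testing rather than known directly.

# Hypothetical vector index calculation (illustrative numbers only).
# Vector index = (mosquitoes collected per trap-night)
#                * (estimated fraction of mosquitoes infected),
# computed per species and summed over the monitored vector species.

def vector_index(total_collected, trap_nights, infection_rate):
    """Estimated number of infected mosquitoes per trap-night."""
    abundance = total_collected / trap_nights  # mosquitoes per trap-night
    return abundance * infection_rate

collected = 187         # mosquitoes of one species caught this week (hypothetical)
trap_nights = 4         # traps deployed times nights run (hypothetical)
infection_rate = 0.008  # estimated from pooled testing (hypothetical)

print(f"Weekly vector index: {vector_index(collected, trap_nights, infection_rate):.2f}")
# Weekly vector index: 0.37 -- below the 0.5 threshold discussed later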
The 2012 vector index also began increasing earlier and reached a higher peak than in previous seasons. "Sequential increases in the weekly vector index early in the 2012 season significantly predicted the number of patients with onset of symptoms of West Nile disease in the subsequent 1 to 2 weeks," the investigators said.
For predicting human illness, the vector index was superior to "other entomologic risk measures" such as mosquito abundance or mosquito infection rates, they added.
In the 2012 epidemic, the vector index passed a threshold of 0.5 by the end of June, at which time only three cases of West Nile disease had been reported. Taking action at this point instead of waiting for a sufficient number of cases to be reported would have averted much morbidity and mortality.
The researchers also addressed the concern that augmented spraying of insecticide at the onset of an outbreak of West Nile would harm residents. They analyzed the daily incidence of emergency department visits for skin rashes and acute upper respiratory distress and found no increase in these conditions during the 8-day period of ultralow-volume aerial spraying of minimally toxic pyrethroid insecticides approved by the Environmental Protection Agency for this purpose.
No financial conflicts of interest were reported.
Changing weather patterns, including increasingly mild winters and warmer summers that favor West Nile virus transmission, are becoming more common, said Dr. Stephen M. Ostroff.
Global climate change thus may expand both the zones of risk for West Nile disease as well as the duration of the typical disease "season."
The study by Dr. Chung and her colleagues highlights the importance of maintaining strong vector surveillance and management programs, rather than cutting their funding as many local, state, and federal sources are currently doing, he noted.
Dr. Ostroff is formerly of the Centers for Disease Control and Prevention, Atlanta, and the Pennsylvania Department of Health, Harrisburg. He reported no financial conflicts of interest. These remarks were taken from his editorial accompanying Dr. Chung’s report (JAMA 2013;310:267-8).
FROM JAMA
Major Finding: The overall West Nile neuroinvasive disease incidence rate in Dallas County was 7.30 per 100,000 residents in 2012, compared with 2.91 in 2006, the year of the second-largest outbreak in the county.
Data Source: An analysis of epidemiologic, meteorologic, and geospatial data collected during the 2012 West Nile virus epidemic in Dallas County.
Disclosures: No financial conflicts of interest were reported.
Longer duration of obesity linked to coronary calcification
Longer duration of both overall and abdominal obesity is strongly associated with subclinical coronary heart disease at midlife, as well as with increased progression of that disease over the course of 10 years, according to an analysis of the CARDIA study. The results were published in the July 17 issue of JAMA.
These associations are independent of the degree of adiposity, meaning that any level of overall or abdominal obesity corresponds with increased coronary risk, said Jared P. Reis, Ph.D., of the National Heart, Lung, and Blood Institute, Bethesda, Md., and his associates.
"Each additional year of overall or abdominal obesity beginning in early adulthood was associated with an HR or OR of 1.02 to 1.04 for coronary artery calcification and its progression" in middle age, they noted.
"Our findings suggest that preventing or at least delaying the onset of obesity in young adulthood may substantially reduce the risk of coronary atherosclerosis and limit its progression in later life," Dr. Reis and his colleagues said.
The investigators examined this issue using data from the CARDIA (Coronary Artery Risk Development in Young Adults) study, a multicenter, community-based, longitudinal cohort assessing the development and the determinants of cardiovascular disease over time. The study comprised 5,115 young adults aged 18-30 years at baseline in 1985-1986 who resided in Birmingham, Ala.; Chicago; Minneapolis; and Oakland, Calif. These subjects now have been reexamined at 2, 5, 7, 10, 15, 20, and 25 years after baseline.
The presence and degree of coronary artery calcification was measured using chest CT at year 15 (2000-2001), year 20 (2005-2006), and/or year 25 (2010-2011).
For their study, Dr. Reis and his associates focused on 3,275 of these CARDIA participants who were not obese at baseline. Roughly 46% were black and 51% were women.
A total of 40.4% of their study subjects developed overall obesity and 41.0% developed abdominal obesity during follow-up, with significant overlap in these two categories. The mean age at onset of overall obesity was 35.4 years, and mean age at onset of abdominal obesity was 37.7 years. The mean duration of overall obesity was 13 years, and that of abdominal obesity was 12 years.
Subclinical coronary artery calcification was identified in 27.5% of the 3,275 study subjects overall.
A total of 38.2% of subjects who had overall obesity for more than 20 years were found to have coronary artery calcification, as were 39.3% of those who had abdominal obesity for more than 20 years. In contrast, these rates were 24.9% and 24.7% in nonobese adults, the investigators said (JAMA 2013;310:280-8 [doi:10.1001/jama.2013.7833]).
Similarly, 28.4% of subjects who had overall obesity for more than 20 years were found to have high scores on a measure of coronary artery calcification, as were 28.2% of those who had abdominal obesity for more than 20 years. In contrast, these rates were 15.2% and 15.5% in nonobese adults.
In addition, the rates of coronary artery calcification were higher with increasing duration of obesity. For example, the rate of calcification was 11 per 1,000 person-years in subjects with 0 years of obesity, compared with 16.7 per 1,000 person-years in subjects with more than 20 years of obesity.
Coronary artery calcification also was more likely to progress over the course of 10 years in obese than in nonobese subjects. Rates of progression were 25.2% in adults with more than 20 years of overall obesity and 27.7% in those with more than 20 years of abdominal obesity, compared with 20.2% and 19.5%, respectively, in nonobese adults.
The association between obesity and coronary artery calcification did not differ by subjects’ race or sex.
"These findings suggest that the longer duration of exposure to excess adiposity as a result of the obesity epidemic and an earlier age at onset will have important implications [for] the future burden of coronary atherosclerosis and potentially [for] the rates of clinical cardiovascular disease in the United States," Dr. Reis and his associates said.
They added that although the mechanisms by which prolonged adiposity affects coronary artery calcification are not precisely known, it is likely that the sustained expression and secretion of proinflammatory adipocytokines play a role. "Extended impairment of the fibrinolytic system via increased markers of hypercoagulability and hypofibrinolysis may also contribute to atherosclerotic vascular disease," the researchers wrote.
Obesity is also thought to impair nitric-oxide-dependent endothelial function, increase oxidative stress, and upregulate vasoconstrictor proteins, all of which may contribute to coronary atherosclerosis, they said.
This study was supported by the National Heart, Lung, and Blood Institute. Dr. Reis reported no financial conflicts of interest; one of his associates reported receiving grants from Novo Nordisk.
FROM JAMA
Major Finding: Among subjects with overall obesity of more than 20 years’ duration, 38.2% were found to have coronary artery calcification, as were 39.3% of those with abdominal obesity of more than 20 years’ duration. These rates were 24.9% and 24.7% in nonobese adults.
Data Source: A secondary analysis of data from a multicenter community-based longitudinal cohort study in 3,275 nonobese young adults who were followed for 25 years.
Disclosures: This study was supported by the National Heart, Lung, and Blood Institute. Dr. Reis reported no financial conflicts of interest; one of his associates reported receiving grants from Novo Nordisk.
Androgen deprivation therapy linked to acute kidney injury
Androgen deprivation therapy was strongly associated with an increased risk of acute kidney injury among men with nonmetastatic prostate cancer, according to a report in the July 17 issue of JAMA.
This elevation in risk varied slightly among different types of androgen deprivation agents, and was strongest with therapies that combine gonadotropin-releasing hormone agonists with oral antiandrogens. That suggests "a possible additive effect ... on both receptor antagonism and reduction of testosterone excretion," said Francesco Lapi, Pharm.D., Ph.D., of the Centre for Clinical Epidemiology, Jewish General Hospital, Montreal, and his associates (JAMA 2013;310:289-96).
The researchers discovered the risk elevation in what they described as the first population-based study to investigate the association between androgen deprivation therapy and acute kidney injury. They performed the study because even though the treatment traditionally has been reserved for advanced disease, it is now used increasingly in patients with earlier stages of prostate cancer.
In addition, the investigators were prompted to examine a possible link because of the high mortality (approximately 50%) associated with acute kidney injury.
"Although only one case report of flutamide-related acute kidney injury has been published to date, androgen deprivation therapy and its hypogonadal effect have well-known consequences consistent with our findings," they noted.
Dr. Lapi and his colleagues used two large databases in the United Kingdom, the Clinical Practice Research Datalink and the Hospital Episodes Statistics database, to identify 10,250 men newly diagnosed as having prostate cancer in 1998-2008 who were 40 years of age or older at diagnosis and were followed for a mean of 4 years. This yielded more than 42,000 person-years of follow-up.
A total of 232 cases of acute kidney injury occurred, for an overall incidence of 5.5/1,000 person-years, said Dr. Lapi and his associates.
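That rate is consistent with the reported counts. A quick Python check, using the rounded denominator given in the article:

# Crude incidence: cases per 1,000 person-years of follow-up.
cases = 232
person_years = 42_000   # "more than 42,000"; exact denominator not reported

print(f"{cases / person_years * 1_000:.1f} per 1,000 person-years")
# 5.5 per 1,000 person-years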
These cases were matched for age, year of diagnosis, and duration of follow-up with 2,721 control subjects who did not develop acute kidney injury.
Compared with control subjects, men who were using androgen deprivation therapy had a significantly increased risk of acute kidney injury, with an odds ratio of 2.48. That association did not change when the data were adjusted to account for possible confounders, such as comorbidities known to impair kidney function, medications known to have renal toxicity, the severity of the underlying prostate cancer, and the intensity of other cancer treatments.
The investigators then analyzed the data according to type of androgen deprivation therapy, dividing the regimens into six mutually exclusive categories: gonadotropin-releasing hormone (GnRH) agonists (leuprolide, goserelin, triptorelin); oral antiandrogens (cyproterone acetate, flutamide, bicalutamide, nilutamide); combined androgen blockade (GnRH agonists plus oral antiandrogens); bilateral orchiectomy; estrogens; and combinations of those.
The odds ratios were highest for combined androgen blockade and also were significantly elevated for other combination therapies. Only the odds ratios for oral antiandrogens alone and for orchiectomy alone failed to reach statistical significance, although both were above 1.0, the investigators said.
The duration of androgen deprivation therapy was examined in a further analysis of the data. The risk of acute kidney injury was highest early in the course of treatment and decreased slightly, but it remained significantly elevated with longer duration of use.
Finally, in a sensitivity analysis that excluded the 54 cases and 842 controls who had abnormal creatinine levels at baseline, the results were consistent with those of the primary analysis.
The mechanism by which androgen deprivation therapy exerts an adverse effect on the kidney is not known, but the treatment is known to raise the risks of the metabolic syndrome and cardiovascular disease. "A similar rationale can be postulated for the risk of acute kidney injury," Dr. Lapi and his associates said.
The dyslipidemia and hyperglycemia of the metabolic syndrome may promote tubular atrophy and interstitial fibrosis, and may impair glomerular function by expanding and thickening the membranes of the interstitial tubules. Both dyslipidemia and hyperglycemia also raise the risk of thrombosis and induce oxidative stress, which can impact renal function.
In addition, testosterone is thought to protect the kidneys by inducing vasodilation in the renal vessels and enhancing nitric oxide production. So, antagonizing testosterone could promote damage to the glomerulus. And the hypogonadism induced by androgen deprivation can also lead to estrogen deficiency, reducing estrogen’s protective effect against ischemic renal injury, the investigators said.
The study was supported by Prostate Cancer Canada, the Canadian Institutes of Health Research, and the Fonds de recherche en Santé du Québec.
FROM JAMA
Major Finding: Men with prostate cancer who used androgen deprivation therapy had a significantly increased risk of acute kidney injury, with an odds ratio of 2.48, compared with those who didn’t use the therapy.
Data Source: A population-based case-control study involving 10,250 men aged 40 years and older, newly diagnosed with nonmetastatic prostate cancer, who were followed for a mean of 4 years for the development of acute kidney injury.
Disclosures: The study was supported by Prostate Cancer Canada, the Canadian Institutes of Health Research, and the Fonds de recherche en Santé du Québec.
CD123 differentiates acute GVHD from infections, drug reactions
CD123 expression in the intestinal mucosa may be a useful immunohistochemical marker for differentiating acute colonic graft-versus-host disease from infections and drug reactions in hematopoietic stem-cell transplant patients who develop nonspecific gastrointestinal symptoms, according to Dr. Jingmei Lin and her colleagues.
Immunostaining endoscopic biopsy samples with CD123, an interleukin-3 receptor subunit, can identify plasmacytoid dendritic cells that are critical to the development of graft-versus-host disease (GVHD) but are not known to be present in infectious processes, adverse drug reactions, or chemoradiation toxicities. CD123 immunostaining was associated with a sensitivity of 66% and a specificity of 97% for acute GVHD, the researchers said (Hum. Pathol. 2013 June 21 [doi:10.1016/j.humpath.2013.02.023]).
Gastrointestinal GVHD can present as a variety of nonspecific symptoms and can be difficult to distinguish pathologically from colonic cytomegalovirus infection, a major complication following stem-cell transplantation. Gastrointestinal GVHD also can be hard to differentiate pathologically from other opportunistic infections, such as Clostridium difficile and adenovirus infections, as well as from adverse reactions to chemotherapy and radiation. Early distinction is crucial because treatment approaches differ for these three entities and because early therapy improves outcomes for acute GVHD, said Dr. Lin of the departments of pathology and laboratory medicine, Indiana University, Indianapolis, and her associates.
The researchers reviewed 38 colonic endoscopy samples from stem-cell transplant recipients known to have gastrointestinal GVHD and compared them with 14 samples from patients who had not undergone transplantation and who were known to have cytomegalovirus colitis.
The researchers also assessed 11 biopsy samples (colon, stomach, small bowel, and esophagus) from patients who had taken mycophenolate, which is used for the prophylaxis of GVHD. They additionally assessed 47 biopsies (upper and lower GI) from patients who had undergone hematopoietic stem-cell transplantation but had not developed GVHD or infection, and 5 colon biopsies from control subjects.
All of the GVHD patients had presented with nonspecific symptoms of diarrhea, abdominal pain, abdominal cramping, nausea, and/or vomiting.
Among the 38 samples from patients with acute GVHD, 25 (66%) were positive on CD123 staining, showing plasmacytoid dendritic cells in the lamina propria. This marker increased in sensitivity as lesion grade increased: 60% of grade-1 and grade-2 lesions were positive, compared with 72% of grade-3 and grade-4 lesions.
In contrast, 2 of the 14 (14%) samples from patients with CMV, none of the 47 samples from transplant patients without GVHD, none of the 11 samples from patients who took mycophenolate, and none of the 5 control samples were positive for CD123.
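These counts reproduce the sensitivity and specificity reported above, assuming the specificity calculation pools every non-GVHD sample (CMV colitis, transplant recipients without GVHD, mycophenolate users, and controls), a grouping inferred from the figures rather than stated explicitly in the report. A short Python check:

# Reconstructing CD123 test characteristics from the reported counts.
true_positives = 25                  # GVHD samples staining positive
gvhd_samples = 38

false_positives = 2                  # CMV colitis samples staining positive
non_gvhd_samples = 14 + 47 + 11 + 5  # CMV, no-GVHD transplant, mycophenolate, controls

sensitivity = true_positives / gvhd_samples            # 25/38
specificity = 1 - false_positives / non_gvhd_samples   # 75/77

print(f"Sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
# Sensitivity 66%, specificity 97%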
The investigators said that they have no funding or conflicts of interest to disclose.
CD123 expression in the intestinal mucosa may be a useful immunohistochemical marker to differentiate acute colonic graft-versus-host disease in hematopoietic stem-cell transplant patients who develop nonspecific gastrointestinal symptoms, according to Dr. Jingmei Lin and her colleagues.
Immunostaining endoscopic biopsy samples with CD123, an interleukin-3 receptor subunit, can identify plasmacytoid dendritic cells that are critical to the development of graft-versus-host disease (GVHD) but are not known to be present in infectious processes, adverse drug reactions, or chemoradiation toxicities. CD123 immunostaining was associated with a sensitivity of 66% and a specificity of 97% for acute GVHD, the researchers said (Hum. Pathol. 2013 June 21 [doi:10.1016/j.humpath.2013.02.023]).
Gastrointestinal GVHD can present as a variety of nonspecific symptoms and can be difficult to distinguish pathologically from colonic cytomegaloivrus, a major complication following stem-cell transplantation. Gastrointestinal GVHD also can be hard to differentiate pathologically from other opportunistic infections, such as Clostridium difficile and adenovirus infections, as well as from adverse reactions to chemotherapy and radiation. Early distinction is crucial because treatment approaches differ for these three entities and because early therapy improves outcomes for acute GVHD, said Dr. Lin of the departments of pathology and laboratory medicine, Indiana University, Indianapolis, and her associates.
The researchers reviewed 38 colonic endoscopy samples from stem-cell transplant recipients known to have gastrointestinal GVHD and compared them with 14 samples from patients who had not undergone transplantation and who were known to have cytomegalovirus colitis.
The researchers also assessed 11 biopsy samples (colon, stomach, small bowel, and esophagus) from patients who had taken mycophenolate, which is used for the prophylaxis of GVHD. They additionally assessed 47 biopsies (upper and lower GI) from patients who had undergone hematopoietic stem-cell transplantation but had not developed GVHD or infection, and 5 colon biopsies from control subjects.
All of the GVHD patients had presented with nonspecific symptoms of diarrhea, abdominal pain, abdominal cramping, nausea, and/or vomiting.
Among the 38 samples from patients with acute GVHD, 25 (66%) were positive on CD123 staining, showing plasmacytoid dendritic cells in the lamina propria. This marker increased in sensitivity as lesion grade increased: 60% of grade-1 and grade-2 lesions were positive, compared with 72% of grade-3 and grade-4 lesions.
In contrast, 2 of the 14 (14%) samples from patients with CMV, none of the 47 samples from transplant patients without GVHD, none of the 11 samples from patients who took mycophenolate, and none of the 5 control samples were positive for CD123.
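As a quick arithmetic check, a minimal Python sketch (using only the counts reported above; the variable names are ours, not the investigators') shows how the stated 66% sensitivity and 97% specificity follow from these biopsy results:

# Counts reported in the study (see the paragraphs above).
gvhd_positive = 25               # CD123-positive among 38 acute GVHD samples
gvhd_total = 38

# CD123-positive results in the comparison groups: 2 of 14 CMV colitis
# samples, 0 of 47 transplant-without-GVHD, 0 of 11 mycophenolate,
# and 0 of 5 controls.
false_positives = 2
comparison_total = 14 + 47 + 11 + 5        # 77 non-GVHD samples

sensitivity = gvhd_positive / gvhd_total                                # 25/38
specificity = (comparison_total - false_positives) / comparison_total   # 75/77

print(f"sensitivity = {sensitivity:.0%}")   # prints: sensitivity = 66%
print(f"specificity = {specificity:.0%}")   # prints: specificity = 97%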
The investigators said that they have no funding or conflicts of interest to disclose.
FROM HUMAN PATHOLOGY
Major finding: Among 38 colonic endoscopy samples from patients with acute GVHD, 25 (66%) were positive on CD123 staining. The marker increased in sensitivity as lesion grade increased: 60% of grade-1 and grade-2 lesions were positive, compared with 72% of grade-3 and grade-4 lesions.
Data source: A review of endoscopic biopsy samples from stem-cell transplant recipients known to have gastrointestinal GVHD, patients known to have cytomegalovirus colitis, patients who had taken mycophenolate, patients who had hematopoietic stem cell transplants and had not developed GVHD or infection, and control subjects.
Disclosures: The investigators said that they have no funding or conflicts of interest to disclose.
Home + pharmacist BP telemonitoring found successful
An intervention involving home blood pressure monitoring and case management by a pharmacist achieved better control of hypertension than did usual care provided by a primary-care physician, in a study reported July 3 in JAMA.
This positive effect lasted well beyond the 1-year treatment period and through an additional 6 months of follow-up after the intervention was stopped, said Dr. Karen L. Margolis of HealthPartners Institute for Education and Research, Minneapolis, and her associates.
"If these results are found to be cost-effective and durable during an even longer period, it should spur wider testing and dissemination of similar alternative models of care for managing hypertension and other chronic conditions," they said.
The investigators performed the Home Blood Pressure Telemonitoring and Case Management to Control Hypertension (HyperLink) study to determine whether the intervention was safe, effective, and durable in real-world patients "representative of the range of comorbidity and hypertension severity in typical primary-care practice."
The cluster-randomized study was conducted at 16 primary-care clinics in a multispecialty practice that was part of an integrated health system. These clinics had an existing arrangement between primary-care physicians and pharmacists allowing the pharmacists to prescribe and change antihypertensive therapy according to specified protocols.
The clinics were matched by size and randomized to continue usual hypertension care managed by the primary-care physician (8 clinics) or to implement the telemonitoring intervention (8 clinics). A total of 450 patients with uncontrolled hypertension were included: 228 assigned to the intervention and 222 assigned to usual care.
The mean patient age was 61 years. The study population was almost equally divided between men and women, and 82% were white. Comorbidities were common, including obesity (54%), diabetes (19%), chronic kidney disease (19%), and cardiovascular disease (10%). The mean BP was 148/85 mm Hg.
In the intervention group, patients met with pharmacists for 1 hour to review their history, get general information about hypertension, and receive instructions for operating the home BP monitor that stored and transmitted data to a secure website accessed by the pharmacist. These study subjects were told to transmit at least 3 morning and 3 evening BP measurements each week. Pharmacists retrieved the information and altered medications accordingly.
The pharmacists and patients also consulted via telephone every 2 weeks until BP control was sustained for 6 weeks, and then their phone "visits" were decreased to once per month. After the first 6 months of the intervention, phone calls were scaled back to once every 2 months.
At these visits, the pharmacists discussed lifestyle changes, medication adherence, and adverse effects of medication, and they adjusted antihypertensive medications as necessary. They communicated with patients’ primary-care physicians through the electronic medical record after each call.
At 12 months, the intervention stopped. Patients discontinued using the telemonitors and returned to the care of their primary physicians.
During this intervention and for 6 months thereafter, the patients made periodic visits to the research clinic so the safety and efficacy of the intervention could be monitored.
The primary outcome was the percentage of patients with controlled BP, defined as <140/90 mm Hg, or <130/80 mm Hg if they had concomitant diabetes or kidney disease.
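In code terms, that definition is a simple threshold test. The following Python sketch is our illustration of the stated rule, not the investigators' analysis code:

def bp_controlled(systolic: float, diastolic: float,
                  diabetes_or_kidney_disease: bool) -> bool:
    # Primary-outcome rule as reported: <140/90 mm Hg, or <130/80 mm Hg
    # for patients with concomitant diabetes or kidney disease.
    sys_limit, dia_limit = (130, 80) if diabetes_or_kidney_disease else (140, 90)
    return systolic < sys_limit and diastolic < dia_limit

# The cohort's mean baseline BP of 148/85 mm Hg counts as uncontrolled:
print(bp_controlled(148, 85, diabetes_or_kidney_disease=False))   # False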
At 6 months, this outcome was attained by 72% of the intervention group, compared with 45% of the usual-care group. At 12 months, the corresponding rates were 71% and 53%, and at 18 months, 72% and 57%.
Overall, the intervention group achieved 25%-30% higher rates of BP control than did the usual-care group, Dr. Margolis and her associates said (JAMA 2013;310:46-56).
Patient satisfaction with care was generally similar between the two study groups. At 6 months, more patients who received the intervention than patients who received usual care felt their clinicians listened carefully, explained things clearly, and respected them, but this difference was no longer present at 12 or 18 months.
Patients who received the intervention were "substantially more confident" than were those who received usual care regarding communication with their health care team, mastering of the home BP monitoring routine, following their medication regimen, and keeping their BP under control.
The intervention group also self-reported that they added less salt to their food than did the control group at all time points, "but other lifestyle factors did not differ" between the two study groups.
Dr. Margolis and her associates calculated that the direct cost of this intervention would total $1,350 per patient.
This study was limited in that a very large number of potentially eligible patients (nearly 15,000) had to be screened to obtain a relatively small study population of 450 subjects. Also, these study subjects were, in general, well educated and had correspondingly high income levels, and approximately half had used a home BP monitor before, so they were not representative of the general population.
"We conclude that BP telemonitoring and pharmacist case-management was safe and effective for improving BP control compared with usual care during 12 months; and improved BP in the intervention group was maintained for 6 months following the intervention," they said.
"We plan future analyses that will take into account indirect costs during 18 months and long-term cost savings from averting hypertension-related events," they added.
No relevant financial conflicts of interest were reported. HealthPartners Institute for Education and Research has entered a royalty-bearing license agreement to commercialize a simulated learning technology for the purpose of broader dissemination.
Given the "consistent and substantial" evidence obtained in this and other studies, it is clear that moving hypertension care out of the office and into patients’ homes is safe and effective, said Dr. David J. Magid and Dr. Beverly B. Green.
Yet home-based hypertension management has not been widely adopted in the United States and isn’t likely to be unless the current system of reimbursement and performance measurement is changed. Medical insurance must cover patients’ costs for BP telemonitors and reimburse providers for their related services, and home BP measurements must be included in quality assurance assessments of hypertension care.
"If home BP monitoring and team-based care were implemented broadly, hypertension management would be easier for patients, and the magnitude of BP reductions ... could lead to substantial reductions in cardiovascular events and mortality," they said.
Dr. Magid is at Kaiser Permanente Colorado Institute for Health Research, Denver. Dr. Green is at Group Health Research Institute at the University of Washington, Seattle. They reported no financial conflicts of interest. These remarks were taken from their editorial accompanying Dr. Margolis’s report (JAMA 2013;310:40-1).
Given the "consistent and substantial" evidence obtained in this and other studies, it is clear that moving hypertension care out of the office and into patients’ homes is safe and effective, said Dr. David J. Magid and Dr. Beverly B. Green.
Yet home-based HT management has not been widely adopted in the United States and isn’t likely to be, unless the current system of reimbursement and performance measurement is changed. Medical insurance must cover patients’ costs for BP telemonitors and reimburse providers for their related services. And home BP must be included in quality assurance assessments of HT care.
"If home BP monitoring and team-based care were implemented broadly, hypertension management would be easier for patients, and the magnitude of BP reductions ... could lead to substantial reductions in cardiovascular events and mortality," they said.
Dr. Magid is at Kaiser Permanente Colorado Institute for Health Research, Denver. Dr. Green is at Group Health Research Institute at the University of Washington, Seattle. They reported no financial conflicts of interest. These remarks were taken from their editorial accompanying Dr. Margolis’s report (JAMA 2013;310:40-1).
Given the "consistent and substantial" evidence obtained in this and other studies, it is clear that moving hypertension care out of the office and into patients’ homes is safe and effective, said Dr. David J. Magid and Dr. Beverly B. Green.
Yet home-based HT management has not been widely adopted in the United States and isn’t likely to be, unless the current system of reimbursement and performance measurement is changed. Medical insurance must cover patients’ costs for BP telemonitors and reimburse providers for their related services. And home BP must be included in quality assurance assessments of HT care.
"If home BP monitoring and team-based care were implemented broadly, hypertension management would be easier for patients, and the magnitude of BP reductions ... could lead to substantial reductions in cardiovascular events and mortality," they said.
Dr. Magid is at Kaiser Permanente Colorado Institute for Health Research, Denver. Dr. Green is at Group Health Research Institute at the University of Washington, Seattle. They reported no financial conflicts of interest. These remarks were taken from their editorial accompanying Dr. Margolis’s report (JAMA 2013;310:40-1).
An intervention involving home blood pressure monitoring and case management by a pharmacist achieved better control of hypertension than did usual care provided by a primary-care physician, in a study reported July 3 in JAMA.
This positive effect lasted well beyond the 1-year treatment period and through an additional 6 months of follow-up after the intervention was stopped, said Dr. Karen L. Margolis of HealthPartners Institute for Education and Research, Minneapolis, and her associates.
"If these results are found to be cost-effective and durable during an even longer period, it should spur wider testing and dissemination of similar alternative models of care for managing hypertension and other chronic conditions," they said.
The investigators performed the Home Blood Pressure Telemonitoring and Case Management to Control Hypertension (HyperLink) study to determine whether the intervention was safe, effective, and durable in real-world patients "representative of the range of comorbidity and hypertension severity in typical primary-care practice."
The cluster-randomized study was conducted at 16 primary-care clinics in a multispecialty practice that was part of an integrated health system. These clinics had an existing arrangement between primary-care physicians and pharmacists allowing the pharmacists to prescribe and change antihypertensive therapy according to specified protocols.
The clinics were matched by size and randomized to continue usual hypertension care managed by the primary-care physician (8 clinics) or to implement the telemonitoring intervention (8 clinics). A total of 450 patients with uncontrolled hypertension were included: 228 assigned to the intervention and 222 assigned to usual care.
The mean patient age was 61 years. The study population was almost equally divided between men and women, and 82% were white. Comorbidities were common, including obesity (54%), diabetes (19%), chronic kidney disease (19%), and cardiovascular disease (10%). The mean BP was 148/85 mm Hg.
In the intervention group, patients met with pharmacists for 1 hour to review their history, get general information about hypertension, and receive instructions for operating the home BP monitor that stored and transmitted data to a secure website accessed by the pharmacist. These study subjects were told to transmit at least 3 morning and 3 evening BP measurements each week. Pharmacists retrieved the information and altered medications accordingly.
The pharmacists and patients also consulted via telephone every 2 weeks until BP control was sustained for 6 weeks, and then their phone "visits" were decreased to once per month. After the first 6 months of the intervention, phone calls were scaled back to once every 2 months.
At these visits, the pharmacists discussed lifestyle changes, medication adherence, and adverse effects of medication, and they adjusted antihypertensive medications as necessary. They communicated with patients’ primary-care physicians through the electronic medical record after each call.
At 12 months, the intervention stopped. Patients discontinued using the telemonitors and returned to the care of their primary physicians.
During this intervention and for 6 months thereafter, the patients made periodic visits to the research clinic so the safety and efficacy of the intervention could be monitored.
The primary outcome was the percentage of patients with controlled BP, defined as <140/90 mm Hg, or <130/80 mm Hg if they had concomitant diabetes or kidney disease.
At 6 months, this outcome was attained by 72% of the intervention group, compared with 45% of the usual-care group. At 12 months, the corresponding rates were 71% and 53%, respectively. And at 18 months, they were 72% and 57%, respectively.
Overall, the intervention group achieved 25%-30% higher rates of BP control than did the usual-care group, Dr. Margolis and her associates said (JAMA 2013;310:46-56).
Patient satisfaction with care was similar between the two study groups. At 6 months, more patients who received the intervention felt their clinicians listened carefully, explained things clearly, and respected them than did patients who received usual care. However, this difference was no longer present at 12 or 18 months.
Patients who received the intervention were "substantially more confident" than were those who received usual care regarding communication with their health care team, mastering of the home BP monitoring routine, following their medication regimen, and keeping their BP under control.
The intervention group also self-reported that they added less salt to their food than did the control group at all time points, "but other lifestyle factors did not differ" between the two study groups.
Dr. Margolis and her associates calculated that the direct cost of this intervention would total $1,350 per patient.
This study was limited in that a very large number of potentially eligible patients (nearly 15,000) had to be screened to obtain a relatively small study population of 450 subjects. Also, these study subjects were, in general, well educated and had correspondingly high income levels, and approximately half had used a home BP monitor before, so they were not representative of the general population.
"We conclude that BP telemonitoring and pharmacist case-management was safe and effective for improving BP control compared with usual care during 12 months; and improved BP in the intervention group was maintained for 6 months following the intervention," they said.
"We plan future analyses that will take into account indirect costs during 18 months and long-term cost savings from averting hypertension-related events," they added.
No relevant financial conflicts of interest were reported. HealthPartners Institute for Education and Research has entered a royalty-bearing license agreement to commercialize a simulated learning technology for the purpose of broader dissemination.
An intervention involving home blood pressure monitoring and case management by a pharmacist achieved better control of hypertension than did usual care provided by a primary-care physician, in a study reported July 3 in JAMA.
This positive effect lasted well beyond the 1-year treatment period and through an additional 6 months of follow-up after the intervention was stopped, said Dr. Karen L. Margolis of HealthPartners Institute for Education and Research, Minneapolis, and her associates.
"If these results are found to be cost-effective and durable during an even longer period, it should spur wider testing and dissemination of similar alternative models of care for managing hypertension and other chronic conditions," they said.
The investigators performed the Home Blood Pressure Telemonitoring and Case Management to Control Hypertension (HyperLink) study to determine whether the intervention was safe, effective, and durable in real-world patients "representative of the range of comorbidity and hypertension severity in typical primary-care practice."
The cluster-randomized study was conducted at 16 primary-care clinics in a multispecialty practice that was part of an integrated health system. These clinics had an existing arrangement between primary-care physicians and pharmacists allowing the pharmacists to prescribe and change antihypertensive therapy according to specified protocols.
The clinics were matched by size and randomized to continue usual hypertension care managed by the primary-care physician (8 clinics) or to implement the telemonitoring intervention (8 clinics). A total of 450 patients with uncontrolled hypertension were included: 228 assigned to the intervention and 222 assigned to usual care.
The mean patient age was 61 years. The study population was almost equally divided between men and women, and 82% were white. Comorbidities were common, including obesity (54%), diabetes (19%), chronic kidney disease (19%), and cardiovascular disease (10%). The mean BP was 148/85 mm Hg.
In the intervention group, patients met with pharmacists for 1 hour to review their history, get general information about hypertension, and receive instructions for operating the home BP monitor that stored and transmitted data to a secure website accessed by the pharmacist. These study subjects were told to transmit at least 3 morning and 3 evening BP measurements each week. Pharmacists retrieved the information and altered medications accordingly.
The pharmacists and patients also consulted via telephone every 2 weeks until BP control was sustained for 6 weeks, and then their phone "visits" were decreased to once per month. After the first 6 months of the intervention, phone calls were scaled back to once every 2 months.
At these visits, the pharmacists discussed lifestyle changes, medication adherence, and adverse effects of medication, and they adjusted antihypertensive medications as necessary. They communicated with patients’ primary-care physicians through the electronic medical record after each call.
At 12 months, the intervention stopped. Patients discontinued using the telemonitors and returned to the care of their primary physicians.
During this intervention and for 6 months thereafter, the patients made periodic visits to the research clinic so the safety and efficacy of the intervention could be monitored.
The primary outcome was the percentage of patients with controlled BP, defined as <140/90 mm Hg, or <130/80 mm Hg if they had concomitant diabetes or kidney disease.
At 6 months, this outcome was attained by 72% of the intervention group, compared with 45% of the usual-care group. At 12 months, the corresponding rates were 71% and 53%, respectively. And at 18 months, they were 72% and 57%, respectively.
Overall, the intervention group achieved 25%-30% higher rates of BP control than did the usual-care group, Dr. Margolis and her associates said (JAMA 2013;310:46-56).
Patient satisfaction with care was similar between the two study groups. At 6 months, more patients who received the intervention felt their clinicians listened carefully, explained things clearly, and respected them than did patients who received usual care. However, this difference was no longer present at 12 or 18 months.
Patients who received the intervention were "substantially more confident" than were those who received usual care regarding communication with their health care team, mastering of the home BP monitoring routine, following their medication regimen, and keeping their BP under control.
The intervention group also self-reported that they added less salt to their food than did the control group at all time points, "but other lifestyle factors did not differ" between the two study groups.
Dr. Margolis and her associates calculated that the direct cost of this intervention would total $1,350 per patient.
This study was limited in that a very large number of potentially eligible patients (nearly 15,000) had to be screened to obtain a relatively small study population of 450 subjects. Also, these study subjects were, in general, well educated and had correspondingly high income levels, and approximately half had used a home BP monitor before, so they were not representative of the general population.
"We conclude that BP telemonitoring and pharmacist case-management was safe and effective for improving BP control compared with usual care during 12 months; and improved BP in the intervention group was maintained for 6 months following the intervention," they said.
"We plan future analyses that will take into account indirect costs during 18 months and long-term cost savings from averting hypertension-related events," they added.
No relevant financial conflicts of interest were reported. HealthPartners Institute for Education and Research has entered a royalty-bearing license agreement to commercialize a simulated learning technology for the purpose of broader dissemination.
FROM JAMA
Major finding: BP control was attained in the intervention group by 72% at 6 months, 71% at 12 months, and 72% at 18 months, compared with 45%, 53%, and 57%, respectively, in the usual-care group.
Data source: A cluster-randomized clinical trial comparing BP control between 228 adults who participated in a 1-year intervention and 222 who received usual care.
Disclosures: No relevant financial conflicts of interest were reported. HealthPartners Institute for Education and Research has entered a royalty-bearing license agreement to commercialize a simulated learning technology for the purpose of broader dissemination.
IVF tied to small rise in mental retardation rate
Children born after in vitro fertilization had a small but significant increase in the incidence of mental retardation in a nationwide Swedish study of 2.5 million births during a 25-year period, according to a report published in the July 3 issue of JAMA.
In contrast, the rate of autistic disorder was not increased among children born after IVF treatment, compared with those born after spontaneous conception.
However, when the data were broken down by type of IVF procedure, the use of one technique – intracytoplasmic sperm injection (ICSI) for paternal infertility – was associated with a small increase in the incidence of autistic disorder, said Dr. Sven Sandin of the Institute of Psychiatry, King’s College London, and his associates.
The latter findings may be particularly important in countries such as the United States where ICSI is often used even when the sperm sample is normal, "because of a presumed (but unproven) higher efficiency," they noted.
Dr. Sandin and his colleagues performed a prospective cohort study to test the hypothesis that IVF in general and ICSI in particular would be associated with an increased risk of mental retardation and autistic disorder. Both IVF and ICSI are known to raise the risk of perinatal complications and preterm birth, which in turn raise the risk of neurodevelopmental abnormalities.
IVF also has been linked with several specific neurological disorders, including cerebral palsy and the Russell-Silver, Beckwith-Wiedemann, and Angelman syndromes. There also is concern that ICSI may allow fertilization with suboptimal sperm because it bypasses the natural selection of sperm; that it may physically damage the egg; and that it may contaminate the cytoplasm of the egg cell with culture media when the sperm is inserted, they said.
The study population comprised 2,541,125 children born in 1982-2007 and followed for a mean of 10 years. Among these children, 30,959 were born following an IVF procedure.
A total of 15,830 children had mental retardation, including 180 (1.1%) who were born after IVF. "Compared with offspring born following spontaneous conception, those born after an IVF procedure had a statistically significantly increased risk of mental retardation (relative risk, 1.18)," said Dr. Sandin, who is also in the department of medical epidemiology and biostatistics, Karolinska Institutet, Stockholm, and his associates.
However, the absolute difference in rates of mental retardation was small, at fewer than 7 cases per 100,000 person-years.
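To see how a relative risk of 1.18 can coexist with such a small absolute difference, consider this Python sketch. The baseline rate below is an illustrative placeholder chosen only to be consistent with the reported figures, not a number taken from the study:

def relative_risk(rate_exposed: float, rate_unexposed: float) -> float:
    # Relative risk: incidence rate in the exposed (IVF) group divided
    # by the rate in the unexposed (spontaneous conception) group.
    return rate_exposed / rate_unexposed

# Placeholder baseline rate, per 100,000 person-years (assumption).
rate_unexposed = 38.0
rate_exposed = rate_unexposed * 1.18          # 44.84

print(round(relative_risk(rate_exposed, rate_unexposed), 2))   # 1.18
print(round(rate_exposed - rate_unexposed, 2))                 # 6.84, i.e., fewer
                                                               # than 7 extra cases
                                                               # per 100,000 person-years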
A total of 6,959 children had autistic disorder, including 103 (1.5%) who were born after IVF. Compared with children born after spontaneous conception, those born after any IVF procedure did not have a significantly increased risk of autistic disorder.
However, this risk was significantly greater after one specific procedure, ICSI using surgically extracted sperm with fresh embryos, with an RR of 4.60, the investigators said (JAMA 2013;310:75-84).
The risk for mental retardation also increased in children born after ICSI using surgically extracted sperm with fresh embryos, with an RR of 2.35.
The risks for both mental retardation and autistic disorder increased further in preterm births and in cases of multiple gestation.
In contrast, the risks for mental retardation and autistic disorder showed no association with IVF procedures regardless of whether blastocyst transfer, cleavage-stage transfer, frozen embryos, or fresh embryos were used.
Hormone stimulation is part of IVF, and some in the medical community have suggested that the use of hormones, not IVF per se, may account for any increased risk for autistic or other neurodevelopmental disorders. To control for this possibility, Dr. Sandin and his associates separately compared outcomes in children born to mothers who used only hormone therapy without any IVF procedure. They found no increase in risk for autistic disorder or mental retardation in this subset of patients.
To account for other factors that might contribute to neurodevelopmental risk, the investigators adjusted the data for parental age and psychiatric history. This had no effect on IVF-associated risks for mental retardation or autistic disorder. Duration of infertility also had no effect on these risks.
The researchers were unable to account for the number of embryos transferred during IVF, because that information was not available in the medical records before 2003.
They also had no information on parental educational attainment or socioeconomic status, which might potentially skew the study population toward more affluent couples who could afford multiple cycles of IVF. However, in Sweden, up to three IVF treatment cycles are free for childless women, so any bias of this type would likely be small, the researchers said.
"Our results should be applicable to most countries where IVF and ICSI are used. There are no major differences in equipment or laboratory work across countries," they added.
The link between multiple births and preterm births on one hand and the risk of neurodevelopmental disorders on the other is a particularly important finding, because decreasing the number of multiple births is now a primary goal of assisted reproductive technologies, said Dr. Marcelle I. Cedars.
"The increased risk of autistic disorder and mental retardation, largely accounted for by multiple pregnancies and preterm delivery, should provide another opportunity for reproductive health physicians to educate patients and other physicians about the importance of limiting embryo transfer number," she said.
"Even though the data are reassuring regarding the absence of risk of autistic disorder and small absolute risk of mental retardation with IVF, continued study of the implications of ovarian stimulation, embryo culture, and multiple embryo transfer is required."
Dr. Cedars is in the division of reproductive endocrinology and infertility at the University of California–San Francisco and directs the UCSF Women’s Health Clinical Research Center. She reported no relevant financial conflicts of interest. These remarks were taken from her comments in an editorial accompanying Dr. Sandin’s report (JAMA 2013;310:42-3).
FROM JAMA
Major finding: Compared with children born after spontaneous conception, those born after any IVF procedure had a significantly increased risk of mental retardation (RR, 1.18); however, the absolute difference in rates of mental retardation was small, at fewer than 7 cases per 100,000 person-years.
Data source: A prospective cohort study involving 2.5 million births in Sweden in 1982-2007, including 30,959 births following IVF treatment.
Disclosures: This study was funded by Autism Speaks and the Swedish Research Council. No financial conflicts of interest were reported.
Add-on salsalate improved poorly controlled type 2 diabetes
The salicylate prodrug salsalate may be an effective add-on therapy for poorly controlled type 2 diabetes, according to a report published online July 1 in Annals of Internal Medicine.
In a 1-year randomized, controlled trial involving 286 patients already taking one to three medications for diabetes, adding daily oral salsalate consistently reduced hemoglobin A1c and fasting glucose levels and improved other cardiometabolic risk factors, said Dr. Allison B. Goldfine of Harvard Medical School, Boston, and her associates.
The magnitude of the treatment effect was similar to that reported for other add-on diabetes therapies currently in use.
The treatment was well tolerated, no serious adverse events occurred, and there was no evidence of gastrointestinal adverse effects. However, some changes in renal function and cholesterol levels "require continued evaluation before salsalate can be recommended for widespread use in type 2 diabetes," the investigators noted.
Dr. Goldfine and her colleagues performed the clinical trial after proof-of-principle studies showed that salsalate reduced blood glucose, triglycerides, free fatty acids, and C-reactive protein concentrations; improved glucose utilization; and increased circulating insulin and adiponectin levels. Their trial was intended to assess the magnitude and durability of the drug’s glycemic efficacy, as well as its tolerability, when taken for 1 year.
The study patients, treated at 21 U.S. sites, were adults aged 75 years or younger who had HbA1c levels of 7.0%-9.5% and were already being managed with lifestyle modification, metformin, insulin, secretagogues, and/or dipeptidyl peptidase-4 inhibitors, alone or in combination. They were randomly assigned to add three daily doses of either salsalate (3 g daily for 2 weeks, then 3.5 g daily) or matching placebo to their drug regimen for 48 weeks.
Patients maintained stable doses of lipid-lowering and hypertension medications whenever possible for the course of the study. They were followed frequently to assess safety factors, treatment adherence, and treatment response.
The primary efficacy outcome was change in HbA1c level at 48 weeks. The salsalate group’s mean level was 0.37 percentage points lower than the control group’s, a significant difference. Moreover, the difference between the two study groups in HbA1c levels was significant at all of the frequent assessments, including the initial assessment after just 4 weeks of treatment.
In addition, the mean HbA1c level was 0.33 percentage points lower in the salsalate group at the end of the study than it had been at baseline, a significant difference. In contrast, the mean HbA1c level was essentially unchanged over time in the control group, with a mean, nonsignificant increase of 0.04 percentage points at week 48.
The magnitude of the treatment benefit was greatest among patients who had the highest HbA1c levels at baseline: for every 1-percentage-point increase in baseline HbA1c, the mean reduction in HbA1c was 0.43 percentage points greater, the investigators said.
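Read as a linear dose-response, that relationship can be sketched as follows. The linear form and the 7.0% reference point are our illustrative assumptions, not the authors' published model:

def extra_hba1c_reduction(baseline_hba1c: float,
                          reference_hba1c: float = 7.0) -> float:
    # Illustrative linear reading of the reported relationship: each
    # 1-percentage-point rise in baseline HbA1c above an assumed 7.0%
    # reference adds about 0.43 percentage points of reduction.
    return 0.43 * (baseline_hba1c - reference_hba1c)

# A patient entering at 9.0% would be expected to see roughly 0.86
# percentage points more HbA1c reduction than one entering at 7.0%.
print(extra_hba1c_reduction(9.0))   # 0.86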
At the conclusion of the study, more patients receiving salsalate (41%) than placebo (23%) had achieved decreases of 0.5 percentage points or more in HbA1c levels.
Consistent with these changes in HbA1c, fasting glucose levels also showed significant improvement only in the patients receiving salsalate.
Reductions in and discontinuation of other diabetes medications also were significantly more frequent with salsalate (62%) than with placebo (13%). "Conversely, concomitant diabetes medications were increased and new therapies instituted more frequently for patients receiving placebo (87%) than those receiving salsalate (38%)," Dr. Goldfine and her associates said.
Among the active drug’s anti-inflammatory effects were decreases in circulating leukocytes, neutrophils, and lymphocytes.
Levels of adiponectin, a cardioprotective protein derived from adipocytes, rose by 27% in the salsalate group, compared with the placebo group. Levels of uric acid, which is associated with cardiometabolic disorders and the progression of renal insufficiency, dropped by 18% in the salsalate group, compared with the placebo group.
Salsalate also reduced triglyceride levels. However, the drug increased total and LDL cholesterol levels without altering HDL cholesterol levels, raised the urinary albumin-creatinine ratio and serum creatinine levels, and was associated with a 1.3-kg increase in weight.
These modest adverse effects "warrant further assessment," the investigators said.
No serious adverse events were attributed to salsalate. However, the relative risk for mild hypoglycemia was six times greater when salsalate, rather than placebo, was added to sulfonylureas.
Mild tinnitus was reported by more patients receiving salsalate (11%) than placebo (5%), but it resolved in all patients.
Gastrointestinal adverse effects did not differ between the two study groups, and there was no evidence of GI bleeding. Quality of life also was similar between patients taking salsalate and those taking placebo.
The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Caraco Pharmaceutical Laboratories provided the salsalate and placebo, but it had no other role in the study.
The salicylate prodrug salsalate may be an effective add-on therapy for poorly controlled type 2 diabetes, according to a report published online July 1 in Annals of Internal Medicine.
In a 1-year randomized, controlled trial involving 286 patients already taking one to three medications for diabetes, adding daily oral salsalate consistently reduced hemoglobin A1c and fasting glucose levels, as well as improving other cardiometabolic risk factors, said Dr. Allison B. Goldfine of Harvard Medical School, Boston, and her associates.
The magnitude of the treatment effect was similar to that reported for other add-on diabetes therapies currently in use.
The treatment was well tolerated, no serious adverse events occurred, and there was no evidence of gastrointestinal adverse effects. However, some changes in renal function and cholesterol levels "require continued evaluation before salsalate can be recommended for widespread use in type 2 diabetes," the investigators noted.
Dr. Goldfine and her colleagues performed the clinical trial after proof-of-principle studies showed that salsalate reduced blood glucose, triglycerides, free fatty acids, and C-reactive protein concentrations; improved glucose utilization; and increased circulating insulin and adiponectin levels. Their trial was intended to assess the magnitude and durability of the drug’s glycemic efficacy, as well as its tolerability, when taken for 1 year.
The study patients treated at 21 U.S. sites were adults who were aged 75 years or younger and who had HbA1c levels of 7.0%-9.5% and were already being managed with lifestyle modification, metformin, insulin, secretagogues, and/or dipeptidyl peptidase-4 inhibitors, alone or in combination. They were randomly assigned to add three daily doses of either salsalate (3 g daily for 2 weeks, then 3.5 g daily) or matching placebo to their drug regimen for 48 weeks.
Patients maintained stable doses of lipid-lowering and hypertension medications whenever possible for the course of the study. They were followed frequently to assess safety factors, treatment adherence, and treatment response.
The primary efficacy outcome was change in HbA1c level at 48 weeks. The salsalate group’s mean level was 0.37 percentage points lower than the control group’s, a significant difference. Moreover, the difference between the two study groups in HbA1c levels was significant at all of the frequent assessments, including the initial assessment after just 4 weeks of treatment.
In addition, the mean HbA1c level was 0.33 percentage points lower in the salsalate group at the end of the study than it had been at baseline, a significant difference. In contrast, the mean HbA1c level was essentially unchanged over time in the control group, with a mean, nonsignificant increase of 0.04 percentage points at week 48.
The magnitude of the treatment benefit was greatest among patients who had the highest HbA1c levels at baseline. For every 1% increase in baseline HbA1c, the mean reduction in HbA1c was 0.43 percentage points, the investigators said.
At the conclusion of the study, more patients receiving salsalate (41%) had achieved 0.5 percentage points or greater decreases in HbA1c levels than those receiving placebo (23%).
Consistent with these changes in HbA1c, fasting glucose levels also showed significant improvement only in the patients receiving salsalate.
Reductions in and discontinuation of other diabetes medications also were significantly more frequent with salsalate (62%) than with placebo (13%). "Conversely, concomitant diabetes medications were increased and new therapies instituted more frequently for patients receiving placebo (87%) than those receiving salsalate (38%)," Dr. Goldfine and her associates said.
Among the active drug’s anti-inflammatory effects were decreases in circulating leukocytes, neutrophils, and lymphocytes.
Levels of adiponectin, a cardioprotective protein derived from adipocytes, rose by 27% in the salsalate group, compared with the placebo group. Levels of uric acid, which is associated with cardiometabolic disorders and the progression of renal insufficiency, dropped by 18% in the salsalate group, compared with the placebo group.
Salsalate also reduced triglyceride levels. However, the drug increased total and LDL cholesterol levels without altering HDL cholesterol levels. It also increased the urinary albumin-creatinine ratio and increased serum creatinine levels. And it was associated with a 1.3-kg increase in weight.
These modest adverse effects "warrant further assessment," the investigators said.
No serious adverse events were attributed to salsalate. However, the relative risk for mild hypoglycemia was six times greater when salsalate was added to sulfonylureas than when placebo was.
Mild tinnitus was reported by more patients receiving salsalate (11%) than placebo (5%), but it resolved in all patients.
Gastrointestinal adverse effects did not differ between the two study groups, and there was no evidence of GI bleeding. Quality of life also was similar between patients taking salsalate and those taking placebo.
The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Caraco Pharmaceutical Laboratories provided the salsalate and placebo, but it had no other role in the study.
The salicylate prodrug salsalate may be an effective add-on therapy for poorly controlled type 2 diabetes, according to a report published online July 1 in Annals of Internal Medicine.
In a 1-year randomized, controlled trial involving 286 patients already taking one to three medications for diabetes, adding daily oral salsalate consistently reduced hemoglobin A1c and fasting glucose levels and improved other cardiometabolic risk factors, said Dr. Allison B. Goldfine of Harvard Medical School, Boston, and her associates.
The magnitude of the treatment effect was similar to that reported for other add-on diabetes therapies currently in use.
The treatment was well tolerated, no serious adverse events occurred, and there was no evidence of gastrointestinal adverse effects. However, some changes in renal function and cholesterol levels "require continued evaluation before salsalate can be recommended for widespread use in type 2 diabetes," the investigators noted.
Dr. Goldfine and her colleagues performed the clinical trial after proof-of-principle studies showed that salsalate reduced blood glucose, triglycerides, free fatty acids, and C-reactive protein concentrations; improved glucose utilization; and increased circulating insulin and adiponectin levels. Their trial was intended to assess the magnitude and durability of the drug’s glycemic efficacy, as well as its tolerability, when taken for 1 year.
The study patients, treated at 21 U.S. sites, were adults aged 75 years or younger who had HbA1c levels of 7.0%-9.5% and were already being managed with lifestyle modification, metformin, insulin, secretagogues, and/or dipeptidyl peptidase-4 inhibitors, alone or in combination. They were randomly assigned to add either salsalate (3 g daily for 2 weeks, then 3.5 g daily, given in three daily doses) or matching placebo to their drug regimen for 48 weeks.
Patients maintained stable doses of lipid-lowering and antihypertensive medications whenever possible throughout the study. They were followed frequently to assess safety, treatment adherence, and treatment response.
The primary efficacy outcome was change in HbA1c level at 48 weeks. The salsalate group’s mean level was 0.37 percentage points lower than the control group’s, a significant difference. Moreover, the difference between the two study groups in HbA1c levels was significant at all of the frequent assessments, including the initial assessment after just 4 weeks of treatment.
In addition, the mean HbA1c level was 0.33 percentage points lower in the salsalate group at the end of the study than it had been at baseline, a significant difference. In contrast, the mean HbA1c level was essentially unchanged over time in the control group, with a mean, nonsignificant increase of 0.04 percentage points at week 48.
The magnitude of the treatment benefit was greatest among patients who had the highest HbA1c levels at baseline. For every 1% increase in baseline HbA1c, the mean reduction in HbA1c was 0.43 percentage points, the investigators said.
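As a rough back-of-the-envelope illustration of that relationship, consider the following hypothetical calculation, which assumes the reported 0.43-point-per-1% effect is linear across the trial's 7.0%-9.5% entry range (the article does not describe the underlying model):

slope = 0.43          # additional HbA1c reduction per 1-point-higher baseline
low, high = 7.0, 9.5  # the trial's HbA1c eligibility range, in percent
extra = slope * (high - low)
print(f"expected extra reduction, top vs. bottom of entry range: {extra:.1f} points")
# -> 1.1

Under that assumption, a patient entering at 9.5% would be expected to gain roughly a full percentage point more HbA1c lowering than one entering at 7.0%.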
At the conclusion of the study, more patients receiving salsalate (41%) than placebo (23%) had achieved decreases in HbA1c of 0.5 percentage points or more.
Consistent with these changes in HbA1c, fasting glucose levels also showed significant improvement only in the patients receiving salsalate.
Reductions in and discontinuation of other diabetes medications also were significantly more frequent with salsalate (62%) than with placebo (13%). "Conversely, concomitant diabetes medications were increased and new therapies instituted more frequently for patients receiving placebo (87%) than those receiving salsalate (38%)," Dr. Goldfine and her associates said.
Among the active drug’s anti-inflammatory effects were decreases in circulating leukocytes, neutrophils, and lymphocytes.
Levels of adiponectin, a cardioprotective protein derived from adipocytes, rose by 27% in the salsalate group, compared with the placebo group. Levels of uric acid, which is associated with cardiometabolic disorders and the progression of renal insufficiency, dropped by 18% in the salsalate group, compared with the placebo group.
Salsalate also reduced triglyceride levels. However, the drug increased total and LDL cholesterol levels without altering HDL cholesterol levels. It also increased the urinary albumin-creatinine ratio and increased serum creatinine levels. And it was associated with a 1.3-kg increase in weight.
These modest adverse effects "warrant further assessment," the investigators said.
No serious adverse events were attributed to salsalate. However, the relative risk for mild hypoglycemia was six times greater when salsalate, rather than placebo, was added to sulfonylureas.
Mild tinnitus was reported by more patients receiving salsalate (11%) than placebo (5%), but it resolved in all patients.
Gastrointestinal adverse effects did not differ between the two study groups, and there was no evidence of GI bleeding. Quality of life also was similar between patients taking salsalate and those taking placebo.
The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Caraco Pharmaceutical Laboratories provided the salsalate and placebo, but it had no other role in the study.
FROM ANNALS OF INTERNAL MEDICINE
Major finding: After 48 weeks, mean HbA1c levels were 0.37 percentage points lower with salsalate than with placebo, a significant difference.
Data source: A randomized, controlled clinical trial comparing outcomes between 146 patients with type 2 diabetes who took daily salsalate and 140 who took matching placebo for 1 year.
Disclosures: The National Institute of Diabetes and Digestive and Kidney Diseases funded the study. Caraco Pharmaceutical Laboratories provided the salsalate and placebo but had no other role in the study.
A third of CHD patients may undergo unnecessary lipid testing
As many as one-third of patients with coronary heart disease who have met target low-density lipoprotein cholesterol levels undergo repeat lipid testing within months, without any further intensification of treatment, according to a large analysis published online July 1 in JAMA Internal Medicine.
Such patients are already being treated aggressively, and repeat lipid testing in them likely represents an overuse or waste of health care resources, said Dr. Salim S. Virani of the Health Policy and Quality Program, Michael E. DeBakey VA Medical Center Health Services Research and Development Center of Excellence, Houston, and his associates.
In a study involving nearly 28,000 CHD patients who made a primary care visit to a Veterans Affairs medical center or community-based outpatient clinic during a 1-year period, 12,686 such "redundant" lipid panels were performed at an average cost of about $16 each. "This is equivalent to $203,990 in annual costs for one VA network, and does not take into account the cost of the patient’s time to undergo lipid testing and the cost of the provider’s time to manage these results and notify the patient," the investigators said.
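The two cost figures are straightforward to reconcile; the minimal check below is illustrative only, since the article gives the cost as "about $16" and the exact per-panel price must be inferred from the reported totals:

redundant_panels = 12_686   # redundant lipid panels performed in one year
annual_cost = 203_990       # reported annual cost for one VA network, in dollars
per_panel = annual_cost / redundant_panels
print(f"implied average cost per panel: ${per_panel:.2f}")
# -> $16.08, consistent with the reported "about $16 each"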
"Our results highlight areas to target for future quality improvement initiatives," they noted.
In the study, Dr. Virani and his colleagues identified 27,947 patients with CHD who were taking a variety of lipid-lowering medications and who had attained the guideline-recommended low-density lipoprotein (LDL) cholesterol target of less than 100 mg/dL. The mean age of these study subjects was 73 years, and most were white men. The prevalence of hypertension was 86%, and that of diabetes was 44%.
The study subjects had well-controlled lipid levels, with excellent mean baseline levels of LDL cholesterol (70 mg/dL), non-HDL cholesterol (94 mg/dL), triglycerides (123 mg/dL), and HDL cholesterol (43 mg/dL). Most (72%) were taking statins.
A total of 9,200 patients underwent repeat lipid panels within 11 months without any intensification of their treatment. Restricting the analysis to patients whose treatment was not intensified ruled out any who might have been attempting to reach an even lower LDL cholesterol target of less than 70 mg/dL.
In these 9,200 patients, "it is likely that repeat lipid testing was performed without any clinical action," and the testing therefore was redundant, the researchers said (JAMA Intern. Med. 2013 July 1 [doi: 10.1001/jamainternmed.2013.8198]).
A total of 34% of the repeat lipid tests were done within 6 months of an index test, and 80% were done within 9 months. Their results were "strikingly similar" to those of the index lipid panels, which also "argues against major medication or therapeutic lifestyle changes as the drivers of repeat lipid testing."
A sensitivity analysis involving a subset of 13,114 patients who had optimal (below 70 mg/dL) LDL levels showed that 62% underwent repeat lipid testing, confirming that redundancy was commonplace even in these patients.
Patients who had concomitant hypertension or diabetes were the most likely to undergo repeat lipid panels, which "points toward a tendency of health care providers to order frequent laboratory testing in complex patients. Frequent lipid testing in these patients likely represents providers’ practice to order comprehensive laboratory tests (including lipid levels) rather than focusing on one clinical issue (e.g., ordering only glycated hemoglobin measurement to assess diabetes control)," Dr. Virani and his associates said.
"Repeat lipid testing likely provides a sense of comfort to the providers that they are being vigilant in following up on their patients with CHD, although a repeat lipid panel may not be indicated," they added.
Their study was limited in that it included few women, and minority races/ethnicities were underrepresented. The findings therefore may not be generalizable to these groups, the investigators said.
Dr. Virani reported no financial conflicts of interest; one of his associates reported ties to numerous industry sources.
This well-conceived study "delivers an important message regarding a type of waste that is likely widespread in health care and that goes under the radar because it involves a low-cost test," said Dr. Joseph P. Drozda Jr.
"It is precisely these low-cost, high-volume tests and procedures that need to be addressed if significant savings from reduction of waste are to be realized," he noted.
Dr. Drozda is at the Center for Innovative Care in Chesterfield, Mo. He reported no relevant financial conflicts of interest. These remarks were taken from his invited commentary accompanying Dr. Virani’s report (JAMA Intern. Med. 2013 July 1 [doi: 10.1001/jamainternmed.2013.6808]).
FROM JAMA INTERNAL MEDICINE
Major finding: A third of the study population, 9,200 patients, underwent redundant lipid testing within months of an index test showing they had already attained target LDL cholesterol levels.
Data source: An observational analysis of lipid testing results in 27,947 CHD patients who had LDL levels of 100 mg/dL or less.
Disclosures: Dr. Virani reported no financial conflicts of interest; one of his associates reported ties to numerous industry sources.
Anemia signals greater severity of celiac disease
Among patients with celiac disease, those who present with anemia have more severe clinical, serologic, and histologic features than do those who present with diarrhea, according to a report published online in Clinical Gastroenterology and Hepatology.
In particular, patients who presented with anemia and were found to have celiac disease had more severe damage of the small bowel mucosa and had higher levels of immune and inflammatory markers, compared with patients who presented with diarrhea and were found to have celiac disease, said Dr. Hussein Abu Daya of the Celiac Disease Center, Columbia University, New York, and his associates.
Patients who presented with anemia also were more likely to show reduced bone mineral density at the time of diagnosis.
At present, there is no standard method for assessing global celiac disease severity, as there is with, for example, inflammatory bowel disease. Until now, patients who had the classical presentation of diarrhea and malabsorption usually were assumed to be more severely affected than those with less typical presentations, the researchers noted.
To test this assumption, Dr. Abu Daya and his colleagues assessed the medical records of 727 patients who presented to their tertiary referral center for evaluation between 1990 and 2011. A total of 562 of these study subjects (77%) presented with diarrhea, and 165 (23%) presented with anemia, almost every case of which was attributed to iron deficiency.
The two study groups were remarkably similar. Both were predominantly female (about 70% of both groups) and had a similar average body mass index (about 23 kg/m²). The same percentage of subjects in each group initially had been diagnosed in childhood (7%), and the same percentage had a family history of celiac disease (21%).
However, patients who presented with anemia showed significantly more mucosal damage on histologic assessment than did those who presented with diarrhea. Severe villous atrophy was found in 53% of the first group, compared with only 34% of the latter group, the investigators said (Clin. Gastro. Hepatol. 2013 June 10 [doi: 10.1016/j.cgh.2013.05.030]).
Patients who presented with anemia also had higher erythrocyte sedimentation rates, indicating inflammation. The mean ESR was 24 mm/hr in the anemia group, compared with 10 mm/hr in the diarrhea group.
In addition, patients who presented with anemia had higher tissue transglutaminase antibody (anti-tTG) ratios (7 vs. 5).
The anemia group also showed greater reductions in bone mineral density. Both osteopenia (56%) and osteoporosis (26%) were more common in the anemia group than in the diarrhea group (35% and 21%, respectively).
Derangements in lipid profiles – chiefly decreases in total cholesterol and high-density lipoprotein (HDL) cholesterol – also were more common in patients who presented with anemia. These are thought to be due mainly to malabsorption, steatorrhea, and, to a lesser extent, enhanced biliary lipid secretion, Dr. Abu Daya and his associates said.
Their findings highlight the need for an index of severity that is specific for celiac disease. Such an index would ideally include quality-of-life measures and mortality risk, which were not addressed in this study, they added.
No financial conflicts of interest were reported.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Major finding: Among patients with celiac disease who presented with anemia, 53% had severe villous atrophy and 56% had osteopenia; the corresponding percentages in those who presented with diarrhea were 34% and 35%.
Data source: A comparison of disease severity between 562 celiac disease patients who presented with diarrhea and 165 who presented with anemia to a single referral center during the period 1990-2011.
Disclosures: No financial conflicts of interest were reported.
Aspirin cuts risk of BRAF wild-type colorectal cancer
Regular aspirin use has been linked to a lower risk of BRAF wild-type, but not BRAF mutated, colorectal cancer, according to a report in the June 26 issue of JAMA.
The absolute difference in risk was considered modest, so further investigation is required to clarify the clinical implications of these study findings. But the results do indicate that BRAF status may someday serve as a marker of sensitivity to aspirin therapy, said Reiko Nishihara, Ph.D., of the Dana-Farber Cancer Institute and Harvard University, Boston, and her associates.
Activating mutations in the BRAF oncogene occur in 10%-15% of colorectal cancers and are thought to play a role in the upregulation and synthesis of certain prostaglandins. Since aspirin suppresses prostaglandin synthesis, "we hypothesized that BRAF-mutant colonic cells might be less sensitive to the antitumor effects of aspirin, whereas BRAF wild-type neoplastic cells might be more susceptible to its antitumor effects," they said.
The study findings also could lead to new treatment strategies that are better tailored to tumor characteristics. And they "enhance understanding of the molecular pathogenesis of colorectal neoplasia and the mechanisms through which aspirin may exert its antineoplastic effects," the investigators noted.
Dr. Nishihara and her colleagues examined the association between aspirin use and colorectal cancer’s BRAF mutation status using data from two large national prospective cohort studies that tracked participants’ aspirin use beginning in the 1980s. They analyzed data on 82,095 women in the Nurses' Health Study (NHS) and 45,770 men in the Health Professionals Follow-Up Study (HPFS), in which numerous dietary and other exposures were monitored in detail at 2-year intervals.
"Our detailed, updated exposure data allowed us to control for the effects of potential confounding by other dietary and lifestyle factors implicated in colorectal carcinogenesis," they said.
The study participants used aspirin primarily to prevent cardiovascular disease and to treat arthritis, other musculoskeletal pain, and headache.
During 28 years (more than 3 million person-years) of follow-up, 1,226 of these subjects developed colorectal cancer. As expected, both men and women who used aspirin regularly showed a significantly lower risk of developing the disease than did aspirin nonusers.
DNA was extracted from stored samples of tumor tissue so that BRAF status could be determined.
Aspirin use was associated with a significantly lower risk of BRAF-wild-type cancer. For this tumor type, the age-adjusted incidence was 40.2 per 100,000 person-years among aspirin nonusers, compared with 30.5 per 100,000 person-years for aspirin users.
In contrast, aspirin use showed no relation to the risk of BRAF-mutated cancer. The age-adjusted incidence was 5.0 per 100,000 person-years among nonusers and 5.7 per 100,000 among aspirin users (JAMA 2013;309:2563-71).
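For readers unfamiliar with these units, the sketch below shows how an incidence rate per 100,000 person-years is computed. It is a crude, hypothetical calculation: the published figures are age adjusted, and the article does not give per-group person-year denominators, so the inputs here are placeholders.

def incidence_per_100k(cases: int, person_years: float) -> float:
    """Crude (unadjusted) incidence rate per 100,000 person-years."""
    return cases / person_years * 100_000

# Placeholder example: 500 cases observed over 1.6 million person-years
print(incidence_per_100k(500, 1_600_000))  # -> 31.25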
In a sensitivity analysis that accounted for the concomitant use of cholesterol-lowering agents, antihypertensive medications, and NSAIDs, the results were unchanged.
Further investigation showed that the risk of BRAF-wild-type colorectal cancer decreased as the weekly dose of aspirin increased. In addition, this risk decreased as the duration of aspirin therapy increased.
In contrast, neither dose nor duration of aspirin therapy affected the risk for BRAF-mutated cancer.
"These findings support the hypothesis that BRAF-mutated cells may show resistance to the anticancer effects of aspirin due to upregulation of the [prostaglandin] pathway," Dr. Nishihara and her associates said.
This study was supported by the National Institutes of Health, the Bennett Family Fund for Targeted Therapies Research, and the National Colorectal Cancer Research Alliance. Dr. Nishihara reported no financial conflicts. An associate reported ties to Bayer Healthcare, Millennium Pharmaceuticals, Pfizer, and Pozen.
Chemoprevention to reduce the incidence of future colorectal cancer is a laudable goal, so long as the benefits outweigh the risks of the drug or drugs used. Aspirin, used mainly for cardiovascular disease prevention, also has been shown in randomized controlled trials to reduce the risk of colorectal neoplasia, and the mechanisms driving that reduced risk might be COX-2 dependent or independent. The RAS/RAF/MAPK pathway, normally regulated via upstream receptor pathways such as EGFR, can upregulate COX-2 expression, but it becomes autonomous when KRAS or BRAF is mutationally activated, providing incessant signaling and perhaps persistently high levels of COX-2 that might not be diminished by aspirin.
Nishihara and colleagues surmised that wild-type BRAF tumors might be more sensitive to aspirin and thus might be prevented by regular aspirin use. In two large cohorts yielding 1,226 incident colorectal cancers, they demonstrated a lower risk of developing wild-type BRAF tumors but not mutant BRAF tumors. A higher weekly dose of aspirin and a longer duration of aspirin use were associated with reduced risk of wild-type BRAF cancers.
The clinical utility of this finding is unknown. Mutant BRAF segregates with sporadic CIMP-positive, microsatellite-unstable tumors, which are often right-sided, flat, serrated lesions in the colon that may proliferate faster and are more resistant to 5-fluorouracil chemotherapy, but which carry an overall better survival prognosis than CIMP-negative or microsatellite-stable tumors. Theoretical selection for these tumors could occur with aspirin use. Tumors from Lynch syndrome patients demonstrate wild-type BRAF, which might explain the reduction in colorectal cancer risk with aspirin seen in this group.
John M. Carethers, M.D., is the John G. Searle Professor and chair of the department of internal medicine, University of Michigan, Ann Arbor. He had no conflicts to disclose.
FROM JAMA
Major finding: The incidence of BRAF wild-type colorectal cancer was significantly lower for adults who used aspirin regularly (30.5 per 100,000 person-years) compared with nonusers (40.2 per 100,000 person-years).
Data source: An analysis of aspirin use among 127,865 men and women, of whom 1,226 developed colorectal cancer during 28 years of follow-up.
Disclosures: This study was supported by the National Institutes of Health, the Bennett Family Fund for Targeted Therapies Research, and the National Colorectal Cancer Research Alliance. Dr. Nishihara reported no financial conflicts. An associate reported ties to Bayer Healthcare, Millennium Pharmaceuticals, Pfizer, and Pozen.