Low Testosterone May Up All-Cause Mortality Risk
Men with low testosterone levels appear to be at increased risk of death from all causes and to have shorter survival times than men with normal testosterone levels, reported Dr. Molly M. Shores of the departments of psychiatry and behavioral sciences at the University of Washington, Seattle, and her associates.
In a recent small study, the researchers had unexpectedly found that men with a low testosterone level had higher 6-month mortality than did those with a normal level who were of similar age and had comparable medical morbidity. “Given these unforeseen preliminary findings, we conducted the present retrospective cohort study to examine if repeatedly low serum testosterone levels were associated with increased mortality in a larger sample of middle-aged and elderly men with a longer follow-up, of up to 8 years,” they said.
Dr. Shores and her associates identified in a clinical database 858 male veterans, aged 40 years and older, who had undergone at least two measures of testosterone levels between 1994 and 1999 and had then been followed for a mean of 4.3 years. They matched the data on these subjects with data in a national Veterans Affairs death registry to obtain mortality information.
The reasons why these men had undergone testosterone testing were not available for analysis, but previous research has shown that, in general, the most common clinical indications are evaluation of sexual dysfunction, osteoporosis, genitourinary conditions, and endocrine conditions, the investigators said (Arch. Intern. Med. 2006;166:1660–5).
A total of 452 men—53% of the study population—had normal serum testosterone levels (defined as 250 ng/dL or higher) or normal free testosterone levels (defined as 0.75 ng/dL or higher). Another 240 men (28%) had equivocal levels, and 166 (19%) had low levels.
Because testosterone levels are known to decrease with both acute and chronic illness, the prevalences of chronic obstructive pulmonary disease, HIV infection, coronary artery disease, and hyperlipidemia were noted. There were no significant differences between the men with normal testosterone levels and those with low testosterone levels regarding these disorders or overall medical morbidity.
All-cause mortality was 20% in men with normal testosterone levels and 25% in those with equivocal levels, compared with 35% in men with low levels. After the data were adjusted to account for the covariates of age, race, body mass index, and other clinical factors, “low testosterone level continued to be associated with an increased mortality risk of 88% greater than in men with normal testosterone levels,” Dr. Shores and her associates said.
To control for the confounding influence of possible acute illness, the researchers conducted a further analysis excluding all subjects who died within 1 year of having their testosterone levels measured. In this subset of subjects, low testosterone levels were still associated with a 68% greater mortality risk, compared with normal levels.
The findings do not show that low testosterone levels directly raise mortality risk, because “a retrospective cohort study cannot establish a causal relationship,” they cautioned.
Classification Puts Childhood Vasculitis in Order
A new international classification for childhood vasculitis promises to improve existing criteria by bringing them into line with current clinical practice, reported Dr. Seza Ozen and fellow associates.
The new criteria, endorsed by the European League Against Rheumatism and the Paediatric Rheumatology European Society, were needed to update existing criteria and make them more specific.
It is the first internationally agreed-upon classification of the vasculitides observed in children, and was devised by gathering opinions from a wide range of pediatric rheumatologists and nephrologists from around the world, reported Dr. Ozen of Hacettepe University, Ankara, Turkey, and associates (Ann. Rheum. Dis. 2006;65:936–41).
The working group of experts updated the classification of Henoch-Schönlein purpura, Kawasaki disease, childhood polyarteritis nodosa, Wegener's granulomatosis, and Takayasu arteritis.
“It was agreed that vessel size would form the backbone of the criteria,” as had been decided at the Chapel Hill Consensus Conference for adult vasculitides, the investigators noted.
In addition, small-vessel vasculitis was divided according to the presence or absence of granulomas, “as this feature has important distinguishing implications.” A category of “other vasculitides” was added for those disorders in which an etiologic process was defined or which did not fit into existing categories.
For Henoch-Schönlein purpura, the primary change was that palpable purpura is now a mandatory criterion. Biopsy findings also were clarified, and since arthritis is common in the childhood type of this disorder, it is now included in the criteria.
The existing criteria for Kawasaki disease were retained. But since coronary artery disease is so important in this disorder, children who have typical echocardiographic changes can now be classified as having Kawasaki disease even if they do not fulfill four of the remaining criteria.
In addition, the perineal desquamation frequently noted in Kawasaki disease has been included under “changes in extremities.”
“Pediatric data during the last 10 years have shown that polyarteritis nodosa in children is hard to classify using the definitions applicable to adults,” Dr. Ozen and associates said. “We decided that abnormal angiography or a characteristic biopsy are mandatory criteria for the diagnosis of the disease” in children.
In addition, the criterion of hepatitis B surface antigen was eliminated, since this is no longer a major feature in pediatrics, “thanks to improving vaccination” practices. And since cutaneous disease is common in childhood polyarteritis nodosa, it has been added to the new criteria.
For Wegener's granulomatosis, subglottic, tracheal, or endobronchial stenosis has been added as a new criterion, because these are frequent features of the childhood disease.
Changes in technology prompted a revision of the criterion concerning radiographic imaging so that CT results are now included. And antineutrophil cytoplasmic antibody testing has also been added.
For Takayasu arteritis, the angiographic criterion was updated to reflect changes in technology, and it is now a mandatory criterion.
Hypertension is common and often is the only presenting sign in pediatric patients, so it has been included as a criterion even though it is nonspecific.
Overall, the new classification should benefit pediatricians and rheumatologists in practice, noted Dr. Ozen and associates. “We believe this was an important task, as appropriate classification criteria for vasculitis in children have been missing for far too long.
“The next step will be to validate these criteria using patients and control groups,” they said.
Low-Dose Ketamine Helps Resistant Depression
A single intravenous infusion of low-dose ketamine relieved treatment-resistant depression within 2 hours, and the “robust” response persisted for 1 week in a preliminary study of 18 patients, reported Dr. Carlos A. Zarate Jr. and his associates at the National Institute of Mental Health, Bethesda, Md.
“To our knowledge, there has never been a report of any other drug or somatic treatment (i.e., sleep deprivation, thyrotropin-releasing hormone, antidepressant, dexamethasone, or [electroconvulsive therapy]) that results in such a dramatic, rapid, and prolonged response with a single administration,” the researchers noted.
Previous trials of antidepressants have yielded response rates of 62% for bupropion, 63% for selective serotonin reuptake inhibitors, and 65% for venlafaxine (Effexor) at 8 weeks. In dramatic contrast, the response rate was 71% within 1 day in this trial of patients who were refractory to an average of six previous treatments, Dr. Zarate and his associates said (Arch. Gen. Psychiatry 2006;63:856–64).
Ketamine, which directly targets the N-methyl-D-aspartate receptor complex, is known to produce adverse effects at higher doses or when used for a prolonged time, so it is unlikely to be used widely in clinical settings. But these findings should lead to development of other, safer agents that similarly target the NMDA system.
In what they described as one of the first studies examining ketamine's antidepressant effects in humans, Dr. Zarate and his associates recruited 12 female and 6 male inpatients who were in good physical health but had recurrent major depressive disorder without psychotic features. The mean patient age was 47 years, the mean length of depressive illness was 24 years, the mean duration of the current depressive episode was 34 months, and the mean number of lifetime episodes of depression was 7.
Eleven patients (61%) had a lifetime comorbid diagnosis of anxiety, and 39% had a comorbid diagnosis of substance abuse or dependence but were certified to be free of drugs or alcohol for at least the preceding 30 days. All had a score of 18 or higher on the 21-item Hamilton Depression Rating Scale.
The subjects were randomly assigned to receive intravenous infusions of either ketamine dissolved in saline or a saline placebo, then were crossed over to the other infusion 1 week later. Outcomes were assessed before infusion and at 40, 80, 100, and 230 minutes afterward, as well as at 1, 2, 3, and 7 days afterward.
Clinical response was defined as a 50% or greater decrease in Hamilton score, and remission was defined as a Hamilton score of 7 or lower. One patient dropped out of the study for medical reasons after a placebo infusion.
A robust treatment response was noted within 110 minutes, and persisted for 7 days or more in 6 of 17 patients (35%). Remission occurred in 5 patients (29%). In contrast, there were no responses or remissions with the placebo infusion.
The prolonged effect was remarkable considering ketamine's short, approximately 2-hour half-life, the researchers said.
There were no serious adverse effects, but patients did report transient perceptual disturbances, confusion, euphoria, and dizziness. Most such effects resolved within 80 minutes of infusion.
Initially, the study was meant to include 22 patients to adequately detect a treatment response. But interim data analysis showed a very large treatment effect with the first 18 patients that would have persisted even if no further responses occurred, so the trial was stopped at that point.
Although the sample size in this preliminary trial was relatively small, three different types of data analysis using five symptom measures (the Hamilton scale, the Beck Depression Inventory, the Brief Psychiatric Rating Scale, the Young Mania Rating Scale, and a visual analog scale) amply demonstrated the significance of the treatment effect, “and the effect sizes were very large at day 1 and moderate to large at day 7,” Dr. Zarate and his associates noted.
In a statement accompanying the publication of this report, Dr. Elias A. Zerhouni, director of the National Institutes of Health, said, “The public health implications of being able to treat major depression this quickly would be enormous.
“These new findings demonstrate the importance of developing new classes of antidepressants,” he added.
A single intravenous infusion of low-dose ketamine relieved treatment-resistant depression within 2 hours, and the “robust” response persisted for 1 week in a preliminary study of 18 patients, reported Dr. Carlos A. Zarate Jr. and his associates at the National Institute of Mental Health, Bethesda, Md.
“To our knowledge, there has never been a report of any other drug or somatic treatment (i.e., sleep deprivation, thyrotropin-releasing hormone, antidepressant, dexamethasone, or [electroconvulsive therapy]) that results in such a dramatic, rapid, and prolonged response with a single administration,” the researchers noted.
Previous trials of antidepressants have yielded response rates of 62% for bupropion, 63% for selective serotonin reuptake inhibitors, and 65% for venlafaxine (Effexor) at 8 weeks. In dramatic contrast, the response rate was 71% within 1 day in this trial of patients who were refractory to an average of six previous treatments, Dr. Zarate and his associates said (Arch. Gen. Psychiatry 2006;63:856–64).
Ketamine, which directly targets the N-methyl-D-aspartate receptor complex, is known to produce adverse effects at higher doses or when used for a prolonged time, so it is unlikely to be used widely in clinical settings. But these findings should lead to development of other, safer agents that similarly target the NMDA system.
In what they described as one of the first studies examining ketamine's antidepressant effects in humans, Dr. Zarate and his associates recruited 12 female and 6 male inpatients who were in good physical health but had recurrent major depressive disorder without psychotic features. The mean patient age was 47 years, the mean length of depressive illness was 24 years, the mean duration of the current depressive episode was 34 months, and the mean number of lifetime episodes of depression was 7.
Eleven patients (61%) had a lifetime comorbid diagnosis of anxiety, and 39% had a comorbid diagnosis of substance abuse or dependence but were certified to be free of drugs or alcohol for at least the preceding 30 days. All had a score of 18 or higher on the 21-item Hamilton Depression Rating Scale.
The subjects were randomly assigned to receive intravenous infusions of either ketamine dissolved in saline or a saline placebo, then were crossed over to the other infusion 1 week later. Outcomes were assessed before infusion and at 40, 80, 100, and 230 minutes afterward, as well as at 1, 2, 3, and 7 days afterward.
Clinical response was defined as a 50% or greater decrease in Hamilton score, and remission was defined as a Hamilton score of 7 or lower. One patient dropped out of the study for medical reasons after a placebo infusion.
A robust treatment response was noted within 110 minutes, and persisted for 7 days or more in 6 of 17 patients (35%). Remission occurred in 5 patients (29%). In contrast, there were no responses or remissions with the placebo infusion.
The prolonged effect was remarkable considering [ketamine's short, approximately 2-hour half-life, the researchers said.
There were no serious adverse effects, but patients did report transient perceptual disturbances, confusion, euphoria, and dizziness. Most such effects resolved within 80 minutes of infusion.
Initially, the study was meant to include 22 patients to adequately detect a treatment response. But interim data analysis showed a very large treatment effect with the first 18 patients that would have persisted even if no further responses occurred, so the trial was stopped at that point.
Although the sample size in this preliminary trial was relatively small, three different types of data analysis using five symptom measures (the Hamilton scale, the Beck Depression Inventory, the Brief Psychiatric Rating Scale, the Young Mania Rating Scale, and a visual analog scale) amply demonstrated the significance of the treatment effect, “and the effect sizes were very large at day 1 and moderate to large at day 7,” Dr. Zarate and his associates noted.
In a statement accompanying the publication of this report, Dr. Elias A. Zerhouni, director of the National Institutes of Health, said, “The public health implications of being able to treat major depression this quickly would be enormous.
“These new findings demonstrate the importance of developing new classes of antidepressants,” he added.
A single intravenous infusion of low-dose ketamine relieved treatment-resistant depression within 2 hours, and the “robust” response persisted for 1 week in a preliminary study of 18 patients, reported Dr. Carlos A. Zarate Jr. and his associates at the National Institute of Mental Health, Bethesda, Md.
“To our knowledge, there has never been a report of any other drug or somatic treatment (i.e., sleep deprivation, thyrotropin-releasing hormone, antidepressant, dexamethasone, or [electroconvulsive therapy]) that results in such a dramatic, rapid, and prolonged response with a single administration,” the researchers noted.
Previous trials of antidepressants have yielded response rates of 62% for bupropion, 63% for selective serotonin reuptake inhibitors, and 65% for venlafaxine (Effexor) at 8 weeks. In dramatic contrast, the response rate was 71% within 1 day in this trial of patients who were refractory to an average of six previous treatments, Dr. Zarate and his associates said (Arch. Gen. Psychiatry 2006;63:856–64).
Ketamine, which directly targets the N-methyl-D-aspartate receptor complex, is known to produce adverse effects at higher doses or when used for a prolonged time, so it is unlikely to be used widely in clinical settings. But these findings should lead to development of other, safer agents that similarly target the NMDA system.
In what they described as one of the first studies examining ketamine's antidepressant effects in humans, Dr. Zarate and his associates recruited 12 female and 6 male inpatients who were in good physical health but had recurrent major depressive disorder without psychotic features. The mean patient age was 47 years, the mean length of depressive illness was 24 years, the mean duration of the current depressive episode was 34 months, and the mean number of lifetime episodes of depression was 7.
Eleven patients (61%) had a lifetime comorbid diagnosis of anxiety, and 39% had a comorbid diagnosis of substance abuse or dependence but were certified to be free of drugs or alcohol for at least the preceding 30 days. All had a score of 18 or higher on the 21-item Hamilton Depression Rating Scale.
The subjects were randomly assigned to receive intravenous infusions of either ketamine dissolved in saline or a saline placebo, then were crossed over to the other infusion 1 week later. Outcomes were assessed before infusion and at 40, 80, 100, and 230 minutes afterward, as well as at 1, 2, 3, and 7 days afterward.
Clinical response was defined as a 50% or greater decrease in Hamilton score, and remission was defined as a Hamilton score of 7 or lower. One patient dropped out of the study for medical reasons after a placebo infusion.
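The response and remission criteria used in the trial can be expressed as a small helper function. This is an illustrative sketch only; the function and variable names are hypothetical, not from the published study.

```python
def classify(baseline_hamd, current_hamd):
    """Classify outcome on the 21-item Hamilton Depression Rating Scale.

    Response: a 50% or greater decrease from baseline.
    Remission: a score of 7 or lower.
    (Illustrative helper; names are hypothetical.)
    """
    response = current_hamd <= 0.5 * baseline_hamd
    remission = current_hamd <= 7
    return {"response": response, "remission": remission}

# A patient entering at a score of 24 who drops to 6 meets both criteria.
print(classify(24, 6))  # {'response': True, 'remission': True}
```

Note that remission is the stricter bar for patients with high baseline scores: a drop from 24 to 10 counts as a response but not a remission.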
A robust treatment response was noted within 110 minutes, and persisted for 7 days or more in 6 of 17 patients (35%). Remission occurred in 5 patients (29%). In contrast, there were no responses or remissions with the placebo infusion.
The prolonged effect was remarkable considering ketamine's short, approximately 2-hour half-life, the researchers said.
There were no serious adverse effects, but patients did report transient perceptual disturbances, confusion, euphoria, and dizziness. Most such effects resolved within 80 minutes of infusion.
Initially, the study was meant to include 22 patients to adequately detect a treatment response. But interim data analysis showed a very large treatment effect with the first 18 patients that would have persisted even if no further responses occurred, so the trial was stopped at that point.
Although the sample size in this preliminary trial was relatively small, three different types of data analysis using five symptom measures (the Hamilton scale, the Beck Depression Inventory, the Brief Psychiatric Rating Scale, the Young Mania Rating Scale, and a visual analog scale) amply demonstrated the significance of the treatment effect, “and the effect sizes were very large at day 1 and moderate to large at day 7,” Dr. Zarate and his associates noted.
In a statement accompanying the publication of this report, Dr. Elias A. Zerhouni, director of the National Institutes of Health, said, “The public health implications of being able to treat major depression this quickly would be enormous.
“These new findings demonstrate the importance of developing new classes of antidepressants,” he added.
High-Carb, Low-Glycemic Index Diet Cuts Weight, Cardiac Risk
A high-carbohydrate, low-glycemic index diet both decreases fat mass and maximizes cardiovascular risk reduction, compared with three other weight-loss diets, reported Joanna McMillan-Price of the University of Sydney (Australia) and her associates.
A low-fat, high-carbohydrate diet is still considered the “best practice” among physicians. In contrast, high-protein and low-glycemic index diets have caught on with the public, but “clinicians and health professionals remain skeptical, calling for greater scientific evidence on which to base” their advice to patients who want to lose weight, the researchers said.
They conducted a 12-week, randomized, controlled trial of four weight-loss diets that all aimed for the same fat content (30% of total energy intake), the same moderate fiber content (30 g/day), and the same daily caloric goals (1,400 kcal for women and 1,900 kcal for men). The carbohydrate and protein contents of the four eating plans varied. The participants comprised 129 overweight young adults. The 98 women and 31 men were aged 18–40 years and had a body mass index (kg/m²) in the overweight range.
Diet 1 was a conventional weight-loss eating plan: a high-carbohydrate (55% of energy intake) and average protein (15% of energy intake) diet that relied on high-glycemic index whole grains, such as fiber-rich breakfast cereals and breads. Diet 2 was a high-carbohydrate but low-glycemic index eating plan, which had the same proportions of protein and carbohydrates but relied on low- instead of high-glycemic index carbohydrates.
Diet 3 was a high-protein (25% of energy intake), low-carbohydrate (45% of energy intake) eating plan that relied on lean red meats and high-glycemic index whole grains. Diet 4 had the same proportions of protein and carbohydrates but relied on low- rather than high-glycemic index carbohydrates.
All four diets reduced weight by 4%–6%, and all reduced fat mass and waist circumference. Weight loss of 5% or more occurred in 31% of subjects on diet 1, in 56% on diet 2, in 66% on diet 3, and in 33% on diet 4.
All four groups reduced fat intake, but the high-carbohydrate groups ate the most fiber and consumed less fat overall, less saturated and polyunsaturated fat, and less cholesterol than did the high-protein groups. Thus, diet 2 produced “the best clinical outcomes, reducing both fat mass and LDL-cholesterol levels,” Ms. McMillan-Price and her associates said (Arch. Intern. Med. 2006;166:1466–75).
“Our findings suggest that dietary glycemic load, not just overall energy intake, influences weight loss and postprandial glycemia. … Diets based on low-glycemic index whole-grain products (in lieu of whole grains with a high glycemic index) maximize cardiovascular risk reduction” as well as weight loss.
In an accompanying editorial, Dr. Simin Liu of the department of epidemiology at University of California, Los Angeles, said that physicians should encourage patients' use of “glycemic index” and “glycemic load” concepts along with caloric count and nutrient composition, because these designations are superior to the “simple” or “complex” carbohydrate classifications in predicting glucose and insulin responses.
“We need to teach our patients to identify low-glycemic index foods within different food groups. Typically, foods with a low degree of starch gelatinization, such as pasta, and those containing a high level of viscous soluble fiber, such as whole grain barley, oats, and rye, have slower rates of digestion and lower glycemic index values,” Dr. Liu noted (Arch. Intern. Med. 2006;166:1438–9).
“Without any drastic change in regular dietary habits, for example, one can simply replace high-glycemic index grains with low-glycemic index grains, and starchy vegetables with less starchy ones, and cut down on soft drinks that are often poor in nutrients yet high in glycemic load,” Dr. Liu added.
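The glycemic load concept Dr. Liu describes combines a food's glycemic index with the amount of carbohydrate actually eaten: glycemic load = GI × available carbohydrate (g) ÷ 100. The sketch below illustrates that standard definition; the food values used are rough approximations for illustration only.

```python
def glycemic_load(glycemic_index, carb_grams):
    """Glycemic load of a serving: GI x available carbohydrate (g) / 100.

    Standard definition; the example GI values below are approximate
    and for illustration only.
    """
    return glycemic_index * carb_grams / 100

# Swapping a high-GI grain for a low-GI one reduces glycemic load
# even when the carbohydrate content of the serving is identical:
print(glycemic_load(70, 50))  # high-GI serving -> 35.0
print(glycemic_load(40, 50))  # low-GI serving  -> 20.0
```

This is why the editorial frames grain substitution, rather than carbohydrate restriction, as the simplest way to lower dietary glycemic load.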
Menopausal Changes Linked to Depression
The “changing hormonal milieu” of menopause is strongly associated with new-onset major depression as well as depressive symptoms in women with no history of mood disturbance, reported Ellen W. Freeman, Ph.D., of the departments of ob.gyn. and psychiatry at the University of Pennsylvania, Philadelphia, and her associates in the Penn Ovarian Aging Study.
Women are significantly more likely to develop a depressive disorder when their levels of estradiol fluctuate, levels of FSH and LH increase, and levels of inhibin B decrease, as happens during the transition to menopause. It appears that the hormonal changes characteristic of ovarian aging produce “destabilizing effects” that contribute to depression, the investigators said (Arch. Gen. Psychiatry 2006;63:375–82).
This finding should make a substantial contribution to what has been only “limited evidence” in the literature about mood symptoms in the perimenopausal years. “Whether mood symptoms increase in the perimenopausal years and whether the occurrence of depressed mood is independently associated with ovarian changes or is secondary to vasomotor or other bothersome symptoms” has been controversial, they noted.
Dr. Freeman and her associates examined the issue by assessing fluctuations in reproductive hormone levels in 231 premenopausal women aged 35–47 years at baseline who were followed for 8 years. During that interval, 43% of the women entered the transition to menopause.
Hormone assays were conducted in 10 assessment periods, the first 6 at 8-month intervals. Blood samples were collected at the start of menstrual cycles. Depressive symptoms were assessed using the CES-D (Center for Epidemiological Studies-Depression) scale; either the PRIME-MD (Primary Care Evaluation of Mental Disorders) or the PHQ (Patient Health Questionnaire) was used to detect major depressive disorder.
A total of 116 women (50%) were found to have depressive symptoms on the CES-D during follow-up. Of these, 16 women had depressive symptoms on two consecutive assessments and 35 had them on three or more consecutive assessments.
Of the 231 women, 59 (26%) were found to have depressive disorders on the PRIME-MD or PHQ; 26 had major depressive disorder and 33 had other depressive disorders. Nine of the women had depressive disorders on two consecutive assessments and four had them on three or more consecutive assessments.
A total of 108 women (47%) showed no depressive symptoms on either measure, Dr. Freeman and her associates said.
Changes in individual women's levels of FSH, LH, and inhibin B were significantly associated with depressive symptoms and with major depression. Similarly, variability in a woman's mean levels of estradiol, FSH, and LH also were linked to depression and depressive symptoms. “On average, the women were 4.58 times more likely to have higher FSH levels … 3 times more likely to have higher LH levels … and 63% more likely to have lower inhibin B levels … at the time of high [depression] scores,” compared with the time before high scores, the investigators said.
After the data were adjusted for several other depression risk factors, including change in employment status or marital status, the researchers found that a woman was, on average, more than five times “more likely to be in menopausal transition at the time of reporting high [depression] scores than she was before the onset of depressive symptoms.”
The “strongest risk factor for the new onset of diagnosed depressive disorders was the increased variability of estradiol (around the woman's own mean levels) at the time of the diagnosed disorder,” Dr. Freeman and her associates said.
However, other health and demographic factors also significantly affected depression risk, “confirming … the multifactorial nature of depressive symptoms.” These factors included hot flashes, body mass index, smoking status, and the presence or absence of PMS.
The 'strongest risk factor for the new onset of [depression] was the increased variability of estradiol.' DR. FREEMAN
High Copper and Fat Intake Accelerates Cognitive Decline
High dietary copper intake markedly accelerated the rate of cognitive decline in people whose diet was also high in saturated and trans fats, reported Dr. Martha Clare Morris of Rush University Medical Center, Chicago, and her associates.
In their analysis of data on 3,718 community residents who were enrolled in the Chicago Health and Aging Project (CHAP), the increase in the rate of cognitive decline “for the high-fat consumers whose total copper intake was in the top 20% (more than 1.6 mg/day) was equivalent to 19 more years of age.” This is “an extraordinarily large estimate of effect,” the researchers said.
Previous studies using data from the CHAP study population had shown that subjects who consumed high levels of saturated or trans fats had two to three times the risk of incident Alzheimer's disease and more rapid cognitive decline than people whose diets were lower in those fats. After noting the results of animal and other human studies that suggested dietary copper may induce the accumulation of amyloid-beta in the brain and cause memory deficits, Dr. Morris and her associates looked at the data on copper intake in the CHAP population.
The subjects were 65 years and older at entry into the study, and were assessed using four different measures of cognitive function at 3- and 6-year follow-up. Among the 604 subjects (16% of the entire cohort) who consumed a diet high in saturated and trans fats, there was a 143% increase in the rate of cognitive decline for those in the highest quintile of copper intake (median 2.75 mg/day), compared with those in the lowest quintile (median 0.88 mg/day).
In contrast, there was no association between copper intake and cognitive decline in subjects who had lower consumption of saturated and trans fats, the investigators said (Arch. Neurol. 2006;63:1085–8).
Copper, zinc, and iron are essential for normal brain function, but the “dyshomeostasis of these metals is thought to play a central role in the formation and neurotoxicity of amyloid-beta and neurofibrillary tangles,” they noted.
In this study, the link with accelerated cognitive decline was specific to copper. Zinc and iron levels showed no interactions with dietary fats.
In a further analysis of data on the subset of 602 subjects who took vitamin supplements containing copper, high copper intake again was associated with faster cognitive decline, but only in those whose diets were high in saturated and trans fats. These results “must be viewed with caution” because the study design was observational rather than prospective, and “the supporting evidence on this topic is limited,” Dr. Morris and her associates said.
High dietary copper intake markedly accelerated the rate of cognitive decline in people whose diet was also high in saturated and trans fats, reported Dr. Martha Clare Morris of Rush University Medical Center, Chicago, and her associates.
In their analysis of data on 3,718 community residents who were enrolled in the Chicago Health and Aging Project (CHAP), the increase in the rate of cognitive decline “for the high-fat consumers whose total copper intake was in the top 20% (more than 1.6 mg/day) was equivalent to 19 more years of age.” This is “an extraordinarily large estimate of effect,” the researchers said.
Previous studies using data from the CHAP study population had shown that subjects who consumed high levels of saturated or trans fats had two to three times the risk of incident Alzheimer's disease and more rapid cognitive decline than people whose diets were lower in those fats. After noting the results of animal and other human studies that suggested dietary copper may induce the accumulation of amyloid-beta in the brain and cause memory deficits, Dr. Morris and her associates looked at the data on copper intake in the CHAP population.
The subjects were 65 years and older at entry into the study, and were assessed using four different measures of cognitive function at 3- and 6-year follow-up. Among the 604 subjects (16% of the entire cohort) who consumed a diet high in saturated and trans fats, there was a 143% increase in the rate of cognitive decline for those in the highest quintile of copper intake (median 2.75 mg/day), compared with those in the lowest quintile (median 0.88 mg/day).
In contrast, there was no association between copper intake and cognitive decline in subjects who had lower consumption of saturated and trans fats, the investigators said (Arch. Neurol. 2006;63:1085–8).
Copper, zinc, and iron are essential for normal brain function, but the “dyshomeostasis of these metals is thought to play a central role in the formation and neurotoxicity of amyloid-beta and neurofibrillary tangles,” they noted.
In this study, the link with accelerated cognitive decline was specific to copper. Zinc and iron levels showed no interactions with dietary fats.
In a further analysis of data on the subset of 602 subjects who took vitamin supplements containing copper, high copper intake again was associated with faster cognitive decline, but only in those whose diets were high in saturated and trans fats. These results “must be viewed with caution” because the study design was observational rather than prospective, and “the supporting evidence on this topic is limited,” Dr. Morris and her associates said.
High dietary copper intake markedly accelerated the rate of cognitive decline in people whose diet was also high in saturated and trans fats, reported Dr. Martha Clare Morris of Rush University Medical Center, Chicago, and her associates.
In their analysis of data on 3,718 community residents who were enrolled in the Chicago Health and Aging Project (CHAP), the increase in the rate of cognitive decline “for the high-fat consumers whose total copper intake was in the top 20% (more than 1.6 mg/day) was equivalent to 19 more years of age.” This is “an extraordinarily large estimate of effect,” the researchers said.
Previous studies using data from the CHAP study population had shown that subjects who consumed high levels of saturated or trans fats had two to three times the risk of incident Alzheimer's disease and more rapid cognitive decline than people whose diets were lower in those fats. After noting the results of animal and other human studies that suggested dietary copper may induce the accumulation of amyloid-beta in the brain and cause memory deficits, Dr. Morris and her associates looked at the data on copper intake in the CHAP population.
The subjects were 65 years and older at entry into the study, and were assessed using four different measures of cognitive function at 3- and 6-year follow-up. Among the 604 subjects (16% of the entire cohort) who consumed a diet high in saturated and trans fats, there was a 143% increase in the rate of cognitive decline for those in the highest quintile of copper intake (median 2.75 mg/day), compared with those in the lowest quintile (median 0.88 mg/day).
In contrast, there was no association between copper intake and cognitive decline in subjects who had lower consumption of saturated and trans fats, the investigators said (Arch. Neurol. 2006;63:1085–8).
Copper, zinc, and iron are essential for normal brain function, but the “dyshomeostasis of these metals is thought to play a central role in the formation and neurotoxicity of amyloid-beta and neurofibrillary tangles,” they noted.
In this study, the link with accelerated cognitive decline was specific to copper; zinc and iron intakes showed no interaction with dietary fats.
In a further analysis of data on the subset of 602 subjects who took vitamin supplements containing copper, high copper intake again was associated with faster cognitive decline, but only in those whose diets were high in saturated and trans fats. These results “must be viewed with caution” because the study was observational rather than a randomized trial, and “the supporting evidence on this topic is limited,” Dr. Morris and her associates said.
Possible Biomarker for Preclinical AD Found
Cerebrospinal fluid levels of amyloid-beta 42 may be a biomarker for the early, asymptomatic phase of Alzheimer's disease—a long-awaited leap forward in the quest for preclinical diagnosis, reported Dr. Elaine R. Peskind of the University of Washington, Seattle, and her associates.
Adults who carry the apolipoprotein E4 allele but are cognitively normal show a marked decline in cerebrospinal fluid (CSF) levels of amyloid-beta 42 (Aβ42), presumably because the protein is precipitating out of the CSF and being deposited in plaques within the brain parenchyma, Dr. Peskind and her colleagues noted.
This decline of Aβ42 in cerebrospinal fluid appears to begin in early adulthood and to rapidly accelerate between the ages of 50 and 60 years in apo E4 carriers, long before clinical manifestations of Alzheimer's disease (AD) typically appear.
The finding bolsters the theory that Aβ42 deposition in the brain is a key initiating factor in the pathogenesis of AD, the researchers said. The findings may also point the way to new therapies and preventive strategies (Arch. Neurol. 2006;63:936–9).
Dr. Peskind and her associates assessed both the apo E genotype and CSF concentrations of Aβ42 in 184 healthy adults aged 21–88 years. The 94 men and 90 women had normal cognition and function. Those with the apo E4 allele not only had lower levels of Aβ42, but their levels also declined in a linear fashion as age increased, dropping off precipitously between the ages of 50 and 60 years. In contrast, subjects who did not carry the apo E4 allele showed a slight rise in Aβ42 levels until age 50 and then a slight and slow decline afterward.
Further research is needed, but researchers and clinicians should note that “therapeutic strategies aimed at prevention of AD may need to be applied in early midlife or even younger ages to have maximal effect on amyloid deposition,” the researchers concluded.
In an editorial, Dr. Roger N. Rosenberg of the University of Texas Southwestern Medical Center, Dallas, said the findings suggest that treatment should target “soluble Aβ and tau levels rather than insoluble plaques and tangles” (Arch. Neurol. 2006;63:926–8).
The “plaques and tangles that have captivated our visual attention for a century may not be the key targets for effective therapies after all,” said Dr. Rosenberg.
Nonhormonal Treatments for Hot Flashes Rated Not So Hot
Despite the avid interest in finding nonhormonal therapies for menopausal hot flashes, most alternative treatments have demonstrated only limited efficacy, and their safety remains in question, according to a systematic review of the literature.
Dr. Heidi D. Nelson and her associates at Oregon Health and Science University, Portland, identified all randomized, placebo-controlled trials of nonhormonal treatments for hot flashes in the English-language literature and compared the efficacy and adverse effects of agents other than estrogens, progestins, progesterone, or androgens.
From an initial screening of 4,249 abstracts, they narrowed their focus to 43 trials with adequate study designs. However, even these trials were often flawed by high dropout rates, small study samples, short follow-up periods, and methodologic failings, they noted (JAMA 2006;295:2057–71).
The selected studies included 10 trials of antidepressants, 10 of clonidine, 6 of other prescription drugs, and 17 of isoflavone extracts.
Eleven of the trials included women with breast cancer, many of whom were receiving tamoxifen. This is a population in whom hot flashes are particularly common and for whom estrogen therapy is contraindicated, the researchers said.
A meta-analysis was conducted using 24 of the 43 studies.
Overall, there was some evidence that selective serotonin reuptake inhibitors, serotonin norepinephrine reuptake inhibitors, clonidine, and gabapentin reduce the severity and frequency of hot flashes. However, none of these agents approached the effectiveness of hormone therapy.
“Although these therapies may be most useful for highly symptomatic women who cannot take estrogen, they are not optimal choices for most women.” Their safety as treatments for hot flashes has not been adequately studied, and their adverse effects and cost will make them prohibitive for many women, Dr. Nelson and her associates said.
The evidence for soy isoflavone extracts was contradictory, “even among the largest and highest quality trials,” they noted. No evidence supported the efficacy of red clover isoflavone extracts.
Weight Gain in Adulthood Tied to Breast Cancer Risk
Women who gain weight either in early adulthood or after menopause are at increased risk for postmenopausal breast cancer, compared with women who maintain a stable weight, reported Dr. A. Heather Eliassen of Harvard Medical School and her associates in the Nurses' Health Study.
Moreover, women who lose weight after menopause decrease their breast cancer risk (JAMA 2006;296:193–201).
The researchers based these conclusions on a prospective analysis of a subset of 49,514 women participating in the Nurses' Health Study, an ongoing cohort study of female nurses who were premenopausal when they enrolled in 1976 and have been followed since then. All the subjects in this analysis were postmenopausal. Weight change during two time periods—after age 18 and after menopause—was examined.
Compared with women who maintained a stable weight after age 18, those who gained at least 25 kg were at increased risk of developing breast cancer, with an adjusted relative risk of 1.45. Similarly, compared with women who maintained a stable weight after menopause, those who gained at least 10 kg were at increased risk of developing breast cancer, with an adjusted relative risk of 1.18.
Conversely, weight loss during either of those time periods was linked to a decreased risk of breast cancer. However, since relatively few women lost weight, particularly after menopause, “more follow-up is needed to confirm our findings [regarding weight loss] and characterize the benefits more precisely,” Dr. Eliassen and her associates said.
The calculated incidence rate of breast cancer in women who gained at least 25 kg after age 18 was 429 cases per 100,000 person-years, compared with 296 cases in women with stable weight. The calculated incidence rate of breast cancer in women who gained at least 10 kg after menopause was 400 cases per 100,000 person-years, compared with 339 cases in women with stable weight.
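As a rough check on these figures, the crude rate ratios implied by the reported incidence rates can be computed directly. (The relative risks of 1.45 and 1.18 reported above are adjusted estimates, so exact agreement with the crude ratios is not guaranteed.)

```python
# Crude rate ratios from the incidence figures quoted in the study
# (cases per 100,000 person-years).

def rate_ratio(exposed_rate: float, reference_rate: float) -> float:
    """Ratio of the incidence rate in the exposed group to the reference rate."""
    return exposed_rate / reference_rate

# Weight gain of at least 25 kg after age 18 vs. stable weight
rr_adulthood = rate_ratio(429, 296)

# Weight gain of at least 10 kg after menopause vs. stable weight
rr_menopause = rate_ratio(400, 339)

print(round(rr_adulthood, 2))  # 1.45
print(round(rr_menopause, 2))  # 1.18
```

Here the crude ratios happen to match the adjusted relative risks to two decimal places, which suggests confounding adjustment changed the estimates little.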
“In addition, we estimated that 15% of postmenopausal breast cancer cases in our population may be attributable to weight gain of 2 kg or more since age 18 years, and 4.4% attributable to weight gain of 2 kg or more since menopause,” the researchers said.
These calculations suggest that weight gain during either time period “contributes substantially” to breast cancer incidence, and that many cases of the disease could be avoided by maintaining weight throughout adulthood.
Behavior Change Stressed in New AHA Guidelines
The American Heart Association's updated guidelines on cardiovascular health for Americans are moving beyond diet to lifestyle.
The guidelines, which were last issued in 2000, were revised after a panel of nutrition and cardiovascular disease experts reviewed the scientific literature that had been published in the intervening 6 years.
“The key message of the [updated] recommendations is to focus on long-term, permanent changes in how we eat and live. The best way to lower cardiovascular risk is to combine physical activity with heart-healthy eating habits, coupled with weight control and avoiding tobacco products,” said Dr. Alice H. Lichtenstein, chair of the association's nutrition committee, in a statement accompanying the release of the new guidelines.
The new recommendations stress balancing the number of calories consumed with the amount of energy expended. Thirty minutes or more of physical activity per day is recommended, even if it is broken up into small increments.
“Achieving a physically active lifestyle requires effective time management, with a particular focus on reducing sedentary activities such as screen time (e.g., watching television, surfing the Web, playing computer games) and making daily choices to move rather than be moved (e.g., taking the stairs instead of the elevator),” the recommendations state (Circulation 2006;doi:10.1161/CIRCULATIONAHA.106.176158).
The updated recommendations lower the advised limit on saturated fat from less than 10% to less than 7% of total calories. Furthermore, for the first time, the guidelines also recommend that trans fats be limited to less than 1% of calories.
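To illustrate what these percentage caps mean in practice, they can be converted to grams per day. The 2,000-kcal daily intake below is an assumed example figure, and fat is taken to supply the standard 9 kcal per gram:

```python
# Convert a percent-of-calories fat cap into grams per day.
# Assumes an illustrative 2,000-kcal diet; fat supplies ~9 kcal per gram.

KCAL_PER_GRAM_FAT = 9

def fat_limit_grams(daily_kcal: float, percent_of_calories: float) -> float:
    """Maximum daily grams of fat allowed under a percent-of-calories cap."""
    return daily_kcal * (percent_of_calories / 100) / KCAL_PER_GRAM_FAT

sat_fat_g = fat_limit_grams(2000, 7)    # saturated fat: < 7% of calories
trans_fat_g = fat_limit_grams(2000, 1)  # trans fat: < 1% of calories

print(round(sat_fat_g, 1))    # 15.6
print(round(trans_fat_g, 1))  # 2.2
```

On a 2,000-kcal diet, the caps work out to roughly 16 g of saturated fat and 2 g of trans fat per day; the gram amounts scale proportionally with total calorie intake.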
The guidelines also suggest increasing the consumption of vegetables, fruits, and whole grain foods; eating fish at least twice a week; and minimizing the intake of high-sugar drinks and foods.
The point is not to meticulously calculate the amount and types of fats and other potentially harmful dietary components, but to more generally avoid foods made with hydrogenated fats or added salt and sugar, as well as to choose foods that minimize these components, such as leaner meats and lower-fat dairy products.
Because food eaten outside the home accounts for an estimated one-third of the calories Americans consume, the recommendations specifically address keeping to a heart-healthy diet and restricting portion sizes when eating food prepared at restaurants, grocery stores, fast-food outlets, schools, and other locales outside the home, said Dr. Lichtenstein, who is also Gershoff professor of nutrition science and policy at Tufts University, Boston, and her associates.
The guidelines now include sections with practical tips for clinicians to recommend and for patients to follow. And for the first time, restaurants, the food industry, schools, and local governments are called on to take practical steps to encourage physical activity and discourage unhealthy eating.
These measures include reformulating processed foods, packaging foods in smaller portions, providing more vegetable options, and providing safe venues for walking and biking.