Waist Circumference Raises Mortality Risk Regardless of BMI
Greater waist circumference was associated with a higher mortality risk independent of body mass index, according to a report in the August 9/23 issue of the Archives of Internal Medicine.
Waist circumference is positively associated with mortality risk within all categories of BMI – normal, overweight, and obese. In fact, the relationship between greater waist circumference and higher mortality is strongest among women with a normal BMI, said Eric J. Jacobs, Ph.D., and his associates in the epidemiology research program at the American Cancer Society, Atlanta.
The link between waist circumference and mortality has been reported in numerous studies, but this is the first study to examine that association within the standard categories of BMI, they noted (Arch. Intern. Med. 2010;170:1293-1301).
Dr. Jacobs and his colleagues used data from the Cancer Prevention Study II Nutrition Cohort, a large prospective study that obtained demographic, medical, and behavioral factors by self-administered questionnaire. They reviewed findings on 48,500 men and 56,343 women aged 50 years and older who enrolled in the 1990s and were followed through 2006. Almost all of the study subjects were white. The median baseline age was 67 years for women and 69 years for men.
The 14,647 deaths during follow-up included 5,410 cancer deaths, 4,942 cardiovascular deaths, 1,189 deaths resulting from respiratory disorders, and 3,106 deaths from all other causes.
Any waist circumference greater than the smallest category (less than 90 cm in men or less than 75 cm in women) was associated with higher mortality, and the mortality risk rose linearly with increasing waist circumference in both men and women.
Waist circumference was positively related to mortality in all patients. For men, the relative risk of mortality rose 16% (normal BMI), 18% (overweight), and 21% (obese) with every 10-cm increase in waist size. For women, the relative risks with every 10-cm increase were greatest for those with a normal BMI: 25% (normal), 15% (overweight), and 13% (obese).
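If the reported linear (log-scale) trend holds, the per-10-cm relative risks compound multiplicatively over larger waist increases. The following Python sketch is illustrative only and not from the study itself; it simply shows the arithmetic implied by those per-10-cm figures.

```python
# Per-10-cm relative mortality risks reported in the study, by sex and
# BMI category (e.g., 1.16 = a 16% higher risk per 10-cm increase).
RR_PER_10_CM = {
    ("men", "normal"): 1.16,
    ("men", "overweight"): 1.18,
    ("men", "obese"): 1.21,
    ("women", "normal"): 1.25,
    ("women", "overweight"): 1.15,
    ("women", "obese"): 1.13,
}


def relative_risk(sex: str, bmi_category: str, waist_increase_cm: float) -> float:
    """Relative mortality risk for a given waist increase, assuming the
    per-10-cm relative risk applies multiplicatively (log-linear trend)."""
    rr10 = RR_PER_10_CM[(sex, bmi_category)]
    return rr10 ** (waist_increase_cm / 10)


# Example: a waist 20 cm larger in a woman with a normal BMI implies
# 1.25 * 1.25 = roughly a 56% higher relative mortality risk.
print(round(relative_risk("women", "normal", 20), 2))
```

Under this assumption, a normal-BMI woman with a waist 20 cm larger than the reference would face roughly a 56% higher relative mortality risk (1.25 squared), which is why the authors emphasize avoiding waist gains even at a normal weight.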
When analyzed by cause of death, the link between waist circumference and mortality was strongest with death from respiratory causes, followed by cardiovascular disease and then cancer, Dr. Jacobs and his associates reported.
The investigators found no significant interactions between BMI-adjusted waist circumference and diabetes, smoking, or follow-up time. In addition, the researchers saw no interaction between waist circumference and hormone therapy in women. Relative mortality risks associated with waist circumference appeared to be greater in men who were less physically active.
“Results from this large prospective study emphasize the importance of waist circumference as a risk factor for mortality in older adults, regardless of whether the BMI is categorized as normal, overweight, or obese. Our results suggest that, regardless of weight, avoiding gains in waist circumference may reduce risk of premature mortality,” the researchers said.
The findings are important because current clinical guidelines do not address waist circumference in normal-weight patients and do not recommend weight loss for abdominally obese patients unless they also have a high BMI, Dr. Jacobs and his associates noted.
The study was limited in that it included participants who were at least 50 years old, and it included very few nonwhite subjects. Thus, “results may not be generalizable to younger populations or those of other racial or ethnic backgrounds,” the authors noted.
No financial disclosures were reported.
Greater waist circumference is associated with higher mortality risk regardless of BMI. The association is strongest with death from respiratory causes, followed by cardiovascular disease, and then cancer. (Photo: Copyright Keith Frith/FOTOLIA)
Bone Marrow Transplant Offers Hope in Recessive Dystrophic EB
Transplantation of donor bone marrow or umbilical cord blood partially corrected the collagen deficiency in five of seven children with recessive dystrophic epidermolysis bullosa who underwent the experimental therapy, greatly ameliorating their severe symptoms by improving their skin and mucosal integrity, according to a recent report in the Aug. 12 issue of the New England Journal of Medicine.
In the phase II clinical trial, six of the patients were alive at 130-799 days after the procedure; their rates of recovery and ultimate outcomes varied. Two showed rapid and dramatic clinical improvement in wound healing and mucocutaneous blistering, two showed marked improvement, and one showed slow, modest improvement. Another patient had a recurrence of blistering after 2 months of substantial improvement, said Dr. John E. Wagner of the University of Minnesota Health Center, Minneapolis, and his associates.
One of the patients who responded to the transplantation died of opportunistic infections on day 183, and another died before bone marrow infusion could be performed, of complications of the immunomyeloablative conditioning chemotherapy.
Epidermolysis bullosa (EB) refers to a group of inherited skin diseases characterized by painful erosions and blisters on skin and mucosal membranes, induced by mild trauma. Recessive dystrophic EB is one of the most severe forms of the disease, present at birth and often eventually resulting in esophageal strictures, mutilating scars, local and systemic infections, joint contractures, fusion of fingers and toes, and aggressive squamous-cell carcinomas. Median survival is only 30 years.
EB is caused by loss-of-function mutations in the gene that encodes collagen type VII (C7). The mutations cause a severe decrease in the expression of C7, "a collagen localized at the dermal-epidermal junction" that contributes to the formation of "anchoring fibrils that tether the epidermal basement membrane to the dermal matrix," noted the investigators. When C7 is not expressed properly, these fibrils fail to form properly, "and epidermal-dermal adherence is lost beneath the lamina densa of the basement membrane," Dr. Wagner and his colleagues explained.
"To date, the care of patients with recessive dystrophic EB has been palliative and restricted to the treatment of individual wounds," they added.
The investigators explored whether bone marrow or cord blood transplantation might correct the collagen mutations systemically in a murine model. When that proved successful, they undertook the clinical trial in the seven pediatric patients (aged 15 months to 14.5 years). All had extensive cutaneous disease and four had severe mucosal disease requiring esophageal dilation and placement of a gastrostomy tube for nutritional support. Five patients had severe mitten deformities, four required wheelchairs, two had renal impairment, and four had severe iron-deficiency anemia.
One patient died before transplantation from hemorrhagic cardiomyopathy "that was probably due to cyclophosphamide cardiotoxicity," and a second had to delay transplantation until other complications from the conditioning chemotherapy resolved. At transplantation, five patients received unfiltered marrow stem cells from a human leukocyte antigen-identical healthy sibling, and one of these five also received umbilical cord blood from the same donor. A sixth patient received umbilical cord blood from an unrelated donor.
All six patients showed improved wound healing and decreased mucocutaneous blistering within the first 100 days, with three showing marked improvement within 30 days. The percentage of affected body surface area was reduced significantly in three patients, as assessed by clinician and parent reports and by documented reductions in the need for bandages.
Four patients were able to discontinue all immunosuppressive therapy and the fifth has tolerated tapering of cyclosporine, according to the investigators.
Skin biopsy specimens showed increases in C7 immunoreactivity at the dermal-epidermal junction after transplantation in all six patients. Testing with an anti-C7 antibody showed an increase in C7 expression over time in five of the six.
At baseline, electron micrographs had shown "a complete absence of mature anchoring fibrils." After transplantation, five of the six patients showed "scanty, wispy structures under the lamina densa," which could represent rudimentary anchoring fibrils or fragmented elastic fibrils. More study is needed to further characterize these structures. None of the micrographs showed the morphologic hallmarks of normal anchoring fibrils, the investigators reported (N. Engl. J. Med. 2010;363:629-39).
"Unexpectedly, we detected substantial proportions of donor cells in the skin and mucosa after treatment; these proportions varied over time and with the location of the biopsy site," they wrote.
Although the precise identity and function of these donor cells have yet to be determined, "we favor the possibility that these healthy donor cells residing in the skin secrete C7 and that the secreted C7 is subsequently incorporated into the lamina densa at the dermal-epidermal junction," Dr. Wagner and his colleagues noted.
"Substantial efforts are under way to understand the physiology of the apparent clinical response after bone marrow transplantation and to identify the stem-cell population responsible for this effect," they wrote.
Until this trial was performed, it was not known whether patients with preexisting mucocutaneous disease could tolerate the conditioning regimens used to prepare for allogeneic bone marrow transplantation. In particular, mucositis was feared because it is a common side effect even in patients without mucocutaneous disease. "The unique skin and mucosal membrane defects of this disease pose a particular challenge to any bone marrow transplantation program," the investigators added.
Yet only one of the six patients developed severe cutaneous toxicity. All patients developed mucositis, but the condition responded to therapy. "Notably, no patient had uncontrolled cellulitis, despite pretransplantation bacterial or fungal skin colonization," they noted.
Other adverse events included transient hyperbilirubinemia (four patients) and renal insufficiency requiring 3 days of hemodialysis (two patients). No patient developed acute or chronic graft-vs.-host disease.
"Clearly, much remains to be learned regarding the mechanism of the apparent functional correction as well as the long-term risks and benefits of this therapeutic approach, including the risk of squamous-cell carcinoma, which may occur after chemotherapy or as a result of incomplete correction of the underlying disease," Dr. Wagner and his colleagues noted.
"Already, we and others are considering modifications to enhance safety, such as coinfusion of mesenchymal stromal cells or the use of reduced-dose conditioning before bone marrow transplantation," they added.
This study was supported by the University of Minnesota Academic Health Center; the National Institutes of Health; the Ministry of Health, Labor, and Welfare of Japan; the Ministry of Education, Culture, Sports, Science, and Technology of Japan; the Epidermolysis Bullosa (Liao Family) Research Fund; the Sarah Rose Mooreland EB Fund; and the Children’s Cancer Research Fund. One of Dr. Wagner’s associates reported previous ties to Johnson & Johnson, Procter & Gamble, Novartis, Astellas, and Allergan.
Despite the still unresolved clinical and scientific issues with this experimental therapy, the systemic approach to this genetic skin disease "represents a leap forward," wrote Dr. Leena Bruckner-Tuderman in an accompanying editorial (N. Engl. J. Med. 2010;363:680-2).
The study by Dr. Wagner and his colleagues "gives cautious hope that effective therapy of recessive dystrophic EB and other genetic skin diseases may one day be available," noted Dr. Bruckner-Tuderman, who is in the department of dermatology at the University of Freiburg (Germany) Medical Center.
Future research should focus on the extent and duration of the therapeutic effects, particularly on whether higher C7 levels in the skin and improvements in mucocutaneous integrity can be sustained over the long term. In addition, more objective methods to assess treatment response are needed, as parental and clinical observation can be "quite subjective."
Dr. Wagner and his colleagues demonstrated that some patients with mucocutaneous fragility can tolerate the conditioning regimen and other procedures and medications needed for bone marrow transplantation. But they also showed that some cannot, and that the life-threatening adverse effects must be weighed carefully against potential benefits.
In their report, the investigators did not specifically address the issue of subject age in this clinical trial. Although some may consider it questionable to test an experimental treatment in children, it was important in this instance to conduct the test among patients in optimal clinical condition.
In recessive dystrophic EB, the severe secondary symptoms accrue with time, so it is reasonable to perform the transplantation as early as possible, "with the aim of preventing severe scarring, deformities, and also, ultimately squamous-cell carcinomas."
She reported no financial disclosures.
Transplantation of donor bone marrow or umbilical cord blood partially corrected the collagen deficiency in five of seven children with recessive dystrophic epidermolysis bullosa who underwent the experimental therapy, greatly ameliorating their severe symptoms by improving their skin and mucosal integrity, according to a recent report in the Aug. 12 issue of the New England Journal of Medicine.
In the phase II clinical trial, six of the patients were alive at 130-799 days after the procedure; their rates of recovery and ultimate outcomes varied. Two showed rapid and dramatic clinical improvement in wound healing and mucocutaneous blistering, two showed marked improvement, and one showed slow, modest improvement. Another patient had a recurrence of blistering after 2 months of substantial improvement, said Dr. John E. Wagner of the University of Minnesota Health Center, Minneapolis, and his associates.
One of the patients who responded to the transplantation died from opportunistic infections on day 183. And one of the patients died before bone marrow infusion could be performed, from complications of the conditioning immunomyeloablative chemotherapy.
Epidermolysis bullosa (EB) refers to a group of inherited skin diseases characterized by painful erosions and blisters on skin and mucosal membranes, induced by mild trauma. Recessive dystrophic EB is one of the most severe forms of the disease, present at birth and often eventually resulting in esophageal strictures, mutilating scars, local and systemic infections, joint contractures, fusion of fingers and toes, and aggressive squamous-cell carcinomas. Median survival is only 30 years.
EB is caused by loss-of-function mutations in the gene that encodes for collagen type VII (C7). The mutations cause a severe decrease in the expression of C7, "a collagen localized at the dermal-epidermal junction" that contributes to the formation of "anchoring fibrils that tether the epidermal basement membrane to the dermal matrix," noted the investigators. When C7 is not expressed properly, these fibrils fail to form properly, "and epidermal-dermal adherence is lost beneath the lamina densa of the basement membrane," Dr. Wagner and his colleagues explained.
"To date, the care of patients with recessive dystrophic EB has been palliative and restricted to the treatment of individual wounds," they added.
The investigators explored whether bone marrow or cord blood transplantation might correct the collagen mutations systemically in a murine model. When that proved successful, they undertook the clinical trial in the seven pediatric patients (aged 15 months to 14.5 years). All had extensive cutaneous disease and four had severe mucosal disease requiring esophageal dilation and placement of a gastrostomy tube for nutritional support. Five patients had severe mitten deformities, four required wheelchairs, two had renal impairment, and four had severe iron-deficiency anemia.
One patient died before transplantation from hemorrhagic cardiomyopathy "that was probably due to cyclophosphamide cardiotoxicity," and a second had to delay transplantation until different complications from the conditioning chemotherapy resolved. At transplantation, five patients received unfiltered marrow stem cells from a human leukocyte antigen – identical but healthy sibling, and one of these five also received umbilical cord blood from the same donor. A sixth patient received umbilical cord blood from an unrelated donor.
All six patients showed improved wound healing and decreased mucocutaneous blistering within the first 100 days, with three showing marked improvement within 30 days. The percentage of affected body surface area was reduced significantly in three patients, as assessed by clinician and parent reports and by documented reductions in the need for bandages.
Four patients were able to discontinue all immunosuppressive therapy and the fifth has tolerated tapering of cyclosporine, according to the investigators.
Skin biopsy specimens showed increases in C7 immunoreactivity at the dermal-epidermal junction after transplantation in all six patients. Testing with an anti-C7 antibody showed an increase in C7 expression over time in five of the six.
At baseline, electron micrographs had shown "a complete absence of mature anchoring fibrils." After transplantation, five of the six patients showed "scanty, wispy structures under the lamina densa," which could represent rudimentary anchoring fibrils or fragmented elastic fibrils. More study is needed to further characterize these structures. None of the micrographs showed the morphologic hallmarks of normal anchoring fibrils, the investigators reported (N. Engl. J. Med. 2010;363:629-39).
"Unexpectedly, we detected substantial proportions of donor cells in the skin and mucosa after treatment; these proportions varied over time and with the location of the biopsy site," they wrote.
Although the precise identity and function of these donor cells has yet to be determined, "we favor the possibility that these healthy donor cells residing in the skin secrete C7 and that the secreted C7 is subsequently incorporated into the lamina densa at the dermal-epidermal junction," Dr. Wagner and his colleagues noted,
"Substantial efforts are under way to understand the physiology of the apparent clinical response after bone marrow transplantation and to identify the stem-cell population responsible for this effect," they wrote.
Until this trial was performed, it was not known whether patients with preexisting mucocutaneous disease could tolerate the conditioning regimens used to prepare for allogeneic bone marrow transplantation. In particular, mucositis was feared because it is a common side effect even in patients without mucocutaneous disease. "The unique skin and mucosal membrane defects of this disease pose a particular challenge to any bone marrow transplantation program," the investigators added.
Yet only one of the six patients developed severe cutaneous toxicity. All patients developed mucositis, but the condition responded to therapy. "Notably, no patient had uncontrolled cellulitis, despite pretransplantation bacterial or fungal skin colonization," they noted.
Other adverse events included transient hyperbilirubinemia (four patients) and renal insufficiency requiring 3 days of hemodialysis (two patients). No patient developed acute or chronic graft-vs.-host disease.
"Clearly, much remains to be learned regarding the mechanism of the apparent functional correction as well as the long-term risks and benefits of this therapeutic approach, including the risk of squamous-cell carcinoma, which may occur after chemotherapy or as a result of incomplete correction of the underlying disease," Dr. Wagner and his colleagues noted.
"Already, we and others are considering modifications to enhance safety, such as coinfusion of mesenchymal stromal cells or the use of reduced-dose conditioning before bone marrow transplantation," they added.
This study was supported by the University of Minnesota Academic Health Center; the National institutes of Health; the Ministry of Health, Labor, and Welfare of Japan; the Ministry of Education, Culture, Sports, Science, and Technology of Japan; the Epidermolysis Bullosa (Liao Family) Research Fund; the Sarah Rose Mooreland EB Fund; and the Children’s Cancer Research Fund. One of Dr. Wagner’s associates reported previous ties to Johnson & Johnson, Procter & Gamble, Novartis, Astellas, and Allergan.
Despite the still unresolved clinical and scientific issues with this experimental therapy, the systemic approach to this genetic skin disease "represents a leap forward," wrote Dr. Leena Bruckner-Tuderman in an accompanying editorial (N. Engl. J. Med. 2010;363:680-2).
The study by Dr. Wagner and his colleagues "gives cautious hope that effective therapy of recessive dystrophic EB and other genetic skin diseases may one day be available," noted Dr. Bruckner-Tuderman, is in the department of dermatology at the University of Freiburg (Germany) Medical Center.
Future research should focus on the extent and duration of the therapeutic effects, particularly on whether higher C7 levels in the skin and improvements in mucocutaneous integrity can be sustained over the long term. In addition, more objective methods to assess treatment response are needed, as parental and clinical observation can be "quite subjective."
Dr. Wagner and his colleagues demonstrated that some patients with mucocutaneous fragility can tolerate the conditioning regimen and other procedures and medications needed for bone marrow transplantation. But they also showed that some cannot, and that the life-threatening adverse effects must be weighed carefully against potential benefits.
In their report, the investigators did not specifically address the issue of subject age in this clinical trial. Although some may consider it questionable to test an experimental treatment in children, it was important in this instance to conduct the test among patients in optimal clinical condition.
In recessive dystrophic EB, the severe secondary symptoms accrue with time, so it is reasonable to perform the transplantation as early as possible, "with the aim of preventing severe scarring, deformities, and also, ultimately squamous-cell carcinomas."
She reported no financial disclosures.
Transplantation of donor bone marrow or umbilical cord blood partially corrected the collagen deficiency in five of seven children with recessive dystrophic epidermolysis bullosa who underwent the experimental therapy, greatly ameliorating their severe symptoms by improving their skin and mucosal integrity, according to a recent report in the Aug. 12 issue of the New England Journal of Medicine.
In the phase II clinical trial, six of the patients were alive at 130-799 days after the procedure; their rates of recovery and ultimate outcomes varied. Two showed rapid and dramatic clinical improvement in wound healing and mucocutaneous blistering, two showed marked improvement, and one showed slow, modest improvement. Another patient had a recurrence of blistering after 2 months of substantial improvement, said Dr. John E. Wagner of the University of Minnesota Health Center, Minneapolis, and his associates.
One of the patients who responded to the transplantation died from opportunistic infections on day 183. And one of the patients died before bone marrow infusion could be performed, from complications of the conditioning immunomyeloablative chemotherapy.
Epidermolysis bullosa (EB) refers to a group of inherited skin diseases characterized by painful erosions and blisters on skin and mucosal membranes, induced by mild trauma. Recessive dystrophic EB is one of the most severe forms of the disease, present at birth and often eventually resulting in esophageal strictures, mutilating scars, local and systemic infections, joint contractures, fusion of fingers and toes, and aggressive squamous-cell carcinomas. Median survival is only 30 years.
Recessive dystrophic EB is caused by loss-of-function mutations in the gene that encodes collagen type VII (C7). The mutations cause a severe decrease in the expression of C7, "a collagen localized at the dermal-epidermal junction" that contributes to the formation of "anchoring fibrils that tether the epidermal basement membrane to the dermal matrix," noted the investigators. When C7 is not expressed properly, these fibrils fail to form properly, "and epidermal-dermal adherence is lost beneath the lamina densa of the basement membrane," Dr. Wagner and his colleagues explained.
"To date, the care of patients with recessive dystrophic EB has been palliative and restricted to the treatment of individual wounds," they added.
The investigators explored whether bone marrow or cord blood transplantation might correct the collagen mutations systemically in a murine model. When that proved successful, they undertook the clinical trial in the seven pediatric patients (aged 15 months to 14.5 years). All had extensive cutaneous disease and four had severe mucosal disease requiring esophageal dilation and placement of a gastrostomy tube for nutritional support. Five patients had severe mitten deformities, four required wheelchairs, two had renal impairment, and four had severe iron-deficiency anemia.
One patient died before transplantation from hemorrhagic cardiomyopathy "that was probably due to cyclophosphamide cardiotoxicity," and a second had to delay transplantation until other complications from the conditioning chemotherapy resolved. At transplantation, five patients received unfiltered marrow stem cells from a healthy, human leukocyte antigen–identical sibling, and one of these five also received umbilical cord blood from the same donor. A sixth patient received umbilical cord blood from an unrelated donor.
All six patients showed improved wound healing and decreased mucocutaneous blistering within the first 100 days, with three showing marked improvement within 30 days. The percentage of affected body surface area was reduced significantly in three patients, as assessed by clinician and parent reports and by documented reductions in the need for bandages.
Four patients were able to discontinue all immunosuppressive therapy and the fifth has tolerated tapering of cyclosporine, according to the investigators.
Skin biopsy specimens showed increases in C7 immunoreactivity at the dermal-epidermal junction after transplantation in all six patients. Testing with an anti-C7 antibody showed an increase in C7 expression over time in five of the six.
At baseline, electron micrographs had shown "a complete absence of mature anchoring fibrils." After transplantation, five of the six patients showed "scanty, wispy structures under the lamina densa," which could represent rudimentary anchoring fibrils or fragmented elastic fibrils. More study is needed to further characterize these structures. None of the micrographs showed the morphologic hallmarks of normal anchoring fibrils, the investigators reported (N. Engl. J. Med. 2010;363:629-39).
"Unexpectedly, we detected substantial proportions of donor cells in the skin and mucosa after treatment; these proportions varied over time and with the location of the biopsy site," they wrote.
Although the precise identity and function of these donor cells have yet to be determined, "we favor the possibility that these healthy donor cells residing in the skin secrete C7 and that the secreted C7 is subsequently incorporated into the lamina densa at the dermal-epidermal junction," Dr. Wagner and his colleagues noted.
"Substantial efforts are under way to understand the physiology of the apparent clinical response after bone marrow transplantation and to identify the stem-cell population responsible for this effect," they wrote.
Until this trial was performed, it was not known whether patients with preexisting mucocutaneous disease could tolerate the conditioning regimens used to prepare for allogeneic bone marrow transplantation. In particular, mucositis was feared because it is a common side effect even in patients without mucocutaneous disease. "The unique skin and mucosal membrane defects of this disease pose a particular challenge to any bone marrow transplantation program," the investigators added.
Yet only one of the six patients developed severe cutaneous toxicity. All patients developed mucositis, but the condition responded to therapy. "Notably, no patient had uncontrolled cellulitis, despite pretransplantation bacterial or fungal skin colonization," they noted.
Other adverse events included transient hyperbilirubinemia (four patients) and renal insufficiency requiring 3 days of hemodialysis (two patients). No patient developed acute or chronic graft-vs.-host disease.
"Clearly, much remains to be learned regarding the mechanism of the apparent functional correction as well as the long-term risks and benefits of this therapeutic approach, including the risk of squamous-cell carcinoma, which may occur after chemotherapy or as a result of incomplete correction of the underlying disease," Dr. Wagner and his colleagues noted.
"Already, we and others are considering modifications to enhance safety, such as coinfusion of mesenchymal stromal cells or the use of reduced-dose conditioning before bone marrow transplantation," they added.
This study was supported by the University of Minnesota Academic Health Center; the National Institutes of Health; the Ministry of Health, Labor, and Welfare of Japan; the Ministry of Education, Culture, Sports, Science, and Technology of Japan; the Epidermolysis Bullosa (Liao Family) Research Fund; the Sarah Rose Mooreland EB Fund; and the Children’s Cancer Research Fund. One of Dr. Wagner’s associates reported previous ties to Johnson & Johnson, Procter & Gamble, Novartis, Astellas, and Allergan.
Despite the still unresolved clinical and scientific issues with this experimental therapy, the systemic approach to this genetic skin disease "represents a leap forward," wrote Dr. Leena Bruckner-Tuderman in an accompanying editorial (N. Engl. J. Med. 2010;363:680-2).
The study by Dr. Wagner and his colleagues "gives cautious hope that effective therapy of recessive dystrophic EB and other genetic skin diseases may one day be available," noted Dr. Bruckner-Tuderman, who is in the department of dermatology at the University of Freiburg (Germany) Medical Center.
Future research should focus on the extent and duration of the therapeutic effects, particularly on whether higher C7 levels in the skin and improvements in mucocutaneous integrity can be sustained over the long term. In addition, more objective methods to assess treatment response are needed, as parental and clinical observation can be "quite subjective."
Dr. Wagner and his colleagues demonstrated that some patients with mucocutaneous fragility can tolerate the conditioning regimen and other procedures and medications needed for bone marrow transplantation. But they also showed that some cannot, and that the life-threatening adverse effects must be weighed carefully against potential benefits.
In their report, the investigators did not specifically address the issue of subject age in this clinical trial. Although some may consider it questionable to test an experimental treatment in children, it was important in this instance to conduct the test among patients in optimal clinical condition.
In recessive dystrophic EB, the severe secondary symptoms accrue with time, so it is reasonable to perform the transplantation as early as possible, "with the aim of preventing severe scarring, deformities, and also, ultimately squamous-cell carcinomas."
She reported no financial disclosures.
Obesity at Age 18 Found to Increase PsA Risk
People who are obese at age 18 may be at an increased risk for psoriatic arthritis later in life, according to a new report in the July issue of the Archives of Dermatology.
In a single-center study of 943 psoriasis patients, those who reported being obese at age 18 were three times more likely to develop psoriatic arthritis (PsA), compared with patients who reported having a normal body mass index at age 18, reported Dr. Razieh Soltani-Arabshahi and associates of the University of Utah School of Medicine, Salt Lake City.
In a previous study, the researchers found that patients with psoriasis had an increased BMI, compared with controls. So, they “set out to study if obesity increases the risk of PsA,” using data from a large cohort of subjects enrolled in the Utah Psoriasis Initiative, the researchers noted.
The cohort included consecutive patients older than 18 years who attended university-affiliated psoriasis clinics in 2002-2008 and provided detailed demographic and clinical data.
A total of 250 (27%) of the 943 subjects included in the study reported having PsA.
Of the study patients, 14% had been overweight and 5% had been obese at age 18, according to self-reported height and weight measurements.
Higher BMI was associated with an increased risk of developing PsA, independent of other risk factors such as nail involvement. Each unit increase in BMI at age 18 corresponded to a 5% increase in risk of PsA.
In addition, patients who were obese at age 18 showed an earlier onset of PsA, compared with patients of normal weight at age 18. Twenty percent of those who had been overweight or obese at 18 years developed PsA by age 35. In comparison, among patients of normal weight at age 18, 20% did not develop PsA until age 48.
Moreover, patients who had been overweight or obese at age 18 were more likely to report having severe psoriasis (47% and 57%, respectively) than patients who were of normal weight at age 18 (39%).
The design of the study did not permit the investigators to infer causality. However, it is plausible that obesity and its associated inflammatory state might contribute to both psoriasis and PsA, Dr. Soltani-Arabshahi and colleagues reported (Arch. Dermatol. 2010;146:721-6).
“Evaluation of additional sample sets in an attempt to replicate these results is imperative for strong conclusions to be drawn,” they noted.
The study was limited in that it relied on subjects' self-report of height and weight earlier in life, self-report of psoriasis severity, and self-report of diagnosis of PsA.
Disclosures: The study was supported in part by the Utah Psoriasis Initiative and the Benning Foundation. Dr. Soltani-Arabshahi's associates reported numerous industry relationships.
Later School Start Time Reduced Depressive Symptoms
Delaying the start of school for as little as 30 minutes not only improved several measures of sleep in adolescents at a boarding school, it also improved depressive symptoms, the motivation and alertness to learn, and even some dietary habits, a study has shown.
“The results of this study add to the growing literature supporting the potential benefits of adjusting school schedules to adolescents' sleep needs, circadian rhythm, and developmental stage and of optimizing sleep and alertness in the learning environment,” wrote Dr. Judith A. Owens of Hasbro Children's Hospital, Providence, R.I., and her associates.
They assessed the impact of delaying the school start time from 8:00 a.m. to 8:30 a.m. at a college-prep boarding and day school in Southern New England for 357 students in grades 9-12. Participating students anonymously completed the eight-page Sleep Habits Survey before (225 students) and after (201 students) a 2-month trial period in which the daily class schedule was delayed for 30 minutes (Arch. Ped. Adolesc. Med. 2010;164:608-14).
The survey covers typical sleep and wake behaviors during the preceding week, sleep- and wake-behavior problems such as difficulty falling asleep and difficulty awakening, depressed mood, and daytime sleepiness under varying conditions.
After the change in school start time, students showed a significant 45-minute increase in sleep duration on school nights. This was due to both waking later on school mornings and going to bed earlier on school nights.
The proportion of students who reported that they rarely or never got enough sleep declined significantly, from 69% to 34%, as did the proportion who reported that they "never" got a good night's sleep, which dropped from 29% to 12%.
The percentage of students who got less than 7 hours of sleep on school nights decreased markedly, from 34% to 7%. The percentage who got at least 8 hours of sleep on school nights rose substantially, from 16% to 55%.
Similarly, the percentage of students who reported being bothered by feeling “too tired and unmotivated” to do schoolwork, socialize, or participate in sports much of the time decreased significantly.
Data from the school's health center supported the students' perception that they were less fatigued after school start time was delayed. Significantly more students visited the health center for fatigue-related symptoms before the intervention than afterward, while visits for other medical concerns showed no change.
Data from the school's food services department showed a substantial increase in consumption of healthier foods at breakfast, from 35 servings per month to 83. Teachers' reports of absences and cases of tardiness at first-period classes decreased by nearly half.
Scores on a measure of depressed mood were significantly and negatively correlated with sleep duration on both surveys. After school start time was delayed, the percentage of students who rated themselves as at least somewhat unhappy or depressed decreased significantly, from 66% to 45%, as did the percentage who reported feeling irritated or annoyed much of the time (from 84% to 63%).
This benefit in depressive symptoms is particularly noteworthy, “given the recent concerns raised regarding the relationship between insufficient sleep and both depressive symptoms and suicidal ideation in adolescents,” Dr. Owens and her colleagues wrote.
They added that there had been considerable resistance to changing the school start time, voiced primarily by the faculty and athletic coaches. However, once the trial period concluded, “students and faculty overwhelmingly voted” to retain the later start time for the next term.
As one teacher commented, “I have found the 8:30 start to be the single most positive impact to my general quality of life at [the school] since I started 12 years ago.”
The researchers cautioned that this study was limited in that it did not include a control group and relied on retrospective subjective self-reports rather than on objective measures of sleep variables.
Disclosures: The study was sponsored by Lifespan Hospitals of Rhode Island, a not-for-profit hospital network. The investigators reported no financial conflicts of interest.
Myringotomy Tubes Work With Cochlear Implant
Myringotomy tubes can be placed before, during, or after cochlear implantation without putting the patient or the success of the cochlear device at undue risk, according to a study of 62 children.
Moreover, the presence of myringotomy tubes might actually protect the implanted ear if it is still susceptible to recurrent acute otitis media, thus sparing the patient from additional procedures, said Dr. Christopher F. Barañano and his associates at the University of Alabama at Birmingham.
The investigators performed what they described as the first independent study to analyze the overall management of pediatric ears that have both myringotomy tubes and cochlear implants (CIs), during the entire course of implant candidacy, placement, and follow-up.
The role of myringotomy tubes in CI is controversial. Fearing that the tubes raise the risk of complications, “some surgeons strive to avoid myringotomy tubes and to establish tympanic membrane integrity before proceeding with CI, while others treat recurrent acute otitis media with myringotomy tubes before [implantation] despite CI candidacy,” they noted.
Dr. Barañano and his colleagues reviewed the records of 189 CI cases treated at their hospital between 1998 and 2008. They found 62 children (78 ears) with ipsilateral myringotomy tubes. The mean patient age was 3.2 years, and mean follow-up was 58 months.
In 32 ears, the tubes were spontaneously extruded, and in another 14 the tubes were removed before CI was undertaken. Tubes were left intact in the remaining 32 ears at the time of CI.
The researchers found that in 11 ears in which a myringotomy tube had been extruded or removed before CI, the placement of new tubes was soon required to manage recurrent otitis. In addition, in three ears in which myringotomy tubes had been removed before CI, severe otitis with mastoiditis developed within several months. New tubes were inserted, and no further sequelae developed, they said (Arch. Otolaryngol. Head Neck Surg. 2010;136:557-60).
In all, 26 patients developed otorrhea, which resolved with standard outpatient medical therapies. Four patients had perforation of the tympanic membrane.
This low rate of complications “made us realize that we could handle these patients like our other patients with recurrent acute otitis media,” Dr. Barañano and his associates wrote.
Disclosures: No financial conflicts of interest were reported.
Myringotomy tubes can be placed before, during, or after cochlear implants are placed, without putting the patient or the success of the cochlear device at undue risk, according to a study of 62 children.
Moreover, the presence of myringotomy tubes might actually protect the implanted ear if it is still susceptible to recurrent acute otitis media, thus sparing the patient from additional procedures, said Dr. Christopher F. Barañano and his associates at the University of Alabama at Birmingham.
The investigators performed what they described as the first independent study to analyze the overall management of pediatric ears that have both myringotomy tubes and cochlear implants (CIs), during the entire course of implant candidacy, placement, and follow-up.
The role of myringotomy tubes in CI is controversial. Fearing that the tubes raise the risk of complications, “some surgeons strive to avoid myringotomy tubes and to establish tympanic membrane integrity before proceeding with CI, while others treat recurrent acute otitis media with myringotomy tubes before [implantation] despite CI candidacy,” they noted.
Dr. Barañano and his colleagues reviewed the records of 189 CI cases treated at their hospital between 1998 and 2008. They found 62 children (78 ears) with ipsilateral myringotomy tubes. The mean patient age was 3.2 years, and mean follow-up was 58 months.
In 32 ears, the tubes were spontaneously extruded, and in another 14 the tubes were removed before CI was undertaken. Tubes were left intact in the remaining 32 ears at the time of CI.
The researchers found that in 11 ears in which a myringotomy tube had been extruded or removed before CI, the placement of new tubes was soon required to manage recurrent otitis. In addition, in three ears in which myringotomy tubes had been removed before CI, severe otitis with mastoiditis developed within several months. New tubes were inserted, and no further sequelae developed, they said (Arch. Otolaryngol. Head Neck Surg. 2010;136:557-60).
In all, 26 patients developed otorrhea, which resolved with standard outpatient medical therapies. Four patients had perforation of the tympanic membrane.
This low rate of complications “made us realize that we could handle these patients like our other patients with recurrent acute otitis media,” Dr. Barañano and his associates wrote.
Disclosures: No financial conflicts of interest were reported.
Secondhand Smoke May Raise C-Reactive Protein
Healthy adults exposed to high levels of secondhand tobacco smoke show elevated C-reactive protein levels indicating chronic low-grade inflammation, according to a large database study.
The elevated CRP levels partly explain the higher-than-average risk of cardiovascular death among people exposed to secondhand smoke, said Mark Hamer, Ph.D., of the department of epidemiology and public health at University College London, and his associates.
Noting that “very few large-scale, population-based studies have collected objective biochemical markers of secondhand smoke exposure with follow-up data on mortality,” the investigators did so using data from the Scottish Health Survey and the Health Survey for England.
Dr. Hamer and his colleagues assessed data on 13,443 men and women aged 35 years and older who were free of cancer and cardiovascular disease at baseline and were followed for an average of 8 years. Exposure to secondhand smoke was determined by measuring salivary cotinine, and blood samples were analyzed for circulating CRP levels.
Approximately 21% of the subjects had high exposure to secondhand smoke.
During follow-up there were 1,221 deaths from all causes and 364 deaths from cardiovascular causes. Both types of mortality were associated with greater exposure to secondhand smoke.
Greater exposure to secondhand smoke also was associated with higher CRP levels, indicating chronic low-grade inflammation. This link “partly explained the elevated risk of CVD [cardiovascular disease] and all-cause death associated with high secondhand smoke” exposure, the investigators said (J. Am. Coll. Cardiol. 2010;56:18–23).
In the subgroup of subjects who had never smoked, the risk of CVD-related death was twice as high among those with greater exposure to secondhand smoke as among those with lesser exposure. However, there was no significant association with CRP in this subgroup, suggesting that some other mechanism accounts for the excess cardiovascular risk in these subjects.
There also was a significant association between secondhand smoke and all-cause mortality in never-smokers (hazard ratio 1.33) and in ex-smokers (HR 1.14).
In addition, there was no association between secondhand smoke exposure and CVD in the subgroup of ex-smokers. “This might be partly because ex-smokers already have heightened risk of CVD in comparison with never-smokers, thus secondhand smoke exposure might not add to existing risk,” Dr. Hamer and his associates noted.
Disclosures: This study was supported in part by the British Heart Foundation, the Scottish Government Health Directorates, the U.K. Medical Research Council, the U.S. National Institutes of Health, the National Heart, Lung, and Blood Institute, the National Institute on Aging, the Academy of Finland, and the Wellcome Trust. No financial conflicts were reported.
White Rice Raised Diabetes Risk, Brown Rice Lowered It
Consumption of white rice appears to increase the risk of developing type 2 diabetes, whereas consumption of brown rice appears to decrease that risk.
“Replacing refined grains such as white rice by whole grains, including brown rice, should be recommended to facilitate the prevention of type 2 diabetes,” said Dr. Qi Sun of the Harvard School of Public Health, Boston, and associates.
White rice has a higher glycemic index than does brown rice, and its relationship to type 2 diabetes has been studied in several Asian countries, where it accounts for as much as 75% of the diet. This is the first prospective study to specifically assess the relationship between the disease and the intake of both white and brown rice in a Western population, where white rice accounts for 2% of the diet, Dr. Sun and his colleagues noted.
The researchers used data from three large cohort studies that documented food intake to examine this association, assessing diet and diabetes status in 39,765 men in the HPFS (Health Professionals Follow-Up Study), 69,120 women in the NHS I (Nurses' Health Study I), and 88,343 women in the NHS II.
There were 2,648 incident cases of diabetes during 20 years of follow-up in the HPFS, 5,500 cases during 22 years of follow-up in the NHS I, and 2,359 cases during 14 years of follow-up in the NHS II.
Greater consumption of white rice was linked to a higher risk of diabetes across all three studies. This link was attenuated after the data were adjusted to account for lifestyle and dietary risk factors, “but a trend of increased risk associated with high white rice intake remained,” the researchers said.
Compared with those in the lowest category of white rice intake, “participants who had at least 5 servings of white rice per week had a 17% higher risk of developing type 2 diabetes” (Arch. Intern. Med. 2010;170:961-9).
Greater consumption of brown rice was linked to a lower risk of diabetes. This link was attenuated but remained significant after the data were adjusted to account for risk factors.
“When compared with the participants who ate less than 1 serving of brown rice per month, the pooled risk reduction of type 2 diabetes was 0.89 for intake of 2 or more servings per week,” Dr. Sun and colleagues said. That is, a pooled relative risk of 0.89, or an 11% lower risk.
The study involved working, highly educated health professionals of predominantly European ancestry. The findings may not be generalizable to other populations, they said.
Disclosures: The study was funded by the National Institutes of Health. Dr. Sun, supported by Unilever Corporate Research, reported no financial conflicts.
Tighter BP Control Not Better for Some Diabetics
Tighter control of systolic blood pressure failed to lower mortality or morbidity beyond what was achieved with usual BP control—and it might even be harmful, according to a secondary analysis involving patients with hypertension, diabetes, and coronary artery disease.
“We have shown for the first time … that decreasing systolic BP to lower than 130 mm Hg” did not reduce morbidity and actually raised all-cause mortality, compared with decreasing systolic BP to lower than 140 mm Hg, said Rhonda M. Cooper-DeHoff, Pharm.D., of the department of pharmacotherapy and translational research at the University of Florida, Gainesville, and her associates.
“At this time, there is no compelling evidence to indicate that lowering systolic BP below 130 mm Hg is beneficial for patients with diabetes; thus, emphasis should be placed on maintaining systolic BP between 130 and 139 mm Hg while focusing on weight loss, healthful eating, and other manifestations of cardiovascular morbidity to further reduce long-term CV risk,” they wrote.
The Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure, as well as numerous national and international societies, has recommended tight BP control for diabetic patients since the early 1990s. The American Diabetes Association has stated that there is no threshold for lowering blood pressure among diabetics, and the American Heart Association has concurred and expanded its guideline to include patients with cardiovascular disease.
Nevertheless, “there are limited data about patients with diabetes to support such a recommendation for lower systolic BP, particularly in the growing population of those with CAD [coronary artery disease].”
Dr. Cooper-DeHoff and her colleagues performed a post hoc secondary analysis of data from a 22,576-participant randomized controlled trial called the International Verapamil SR-Trandolapril Study (INVEST).
For their analysis, they focused on the 6,400 patients with hypertension, diabetes, and CAD who had been followed closely for 16,893 patient-years. A total of 35% of the participants achieved tight systolic control (less than 130 mm Hg), 31% had usual control (less than 140 mm Hg), and the remaining 34% had uncontrolled systolic BP.
A primary event (all-cause death, nonfatal myocardial infarction, or nonfatal stroke) occurred in 12.7% of the tight-control group, 12.6% of the usual-control group, and 19.8% of the uncontrolled group. The difference between the tight-control and usual-control groups was not considered significant.
After adjustment, however, all-cause mortality was significantly greater (22.8%) in the tight-control group than in the usual-control group (21.8%). Further analysis showed that a systolic BP of less than 110 mm Hg was associated with a significantly increased risk of death from any cause, the investigators said (JAMA 2010;304:61-7).
“Our data raise the possibility that continued maintenance of systolic BP lower than 130 mm Hg could be hazardous over the long term,” they added.
Recommendations in favor of tight control over the past 20 years were based largely on the findings of two landmark clinical trials, but even in those studies, it is important to note that subjects assigned to the tightest BP control did not achieve their goals, according to the authors. In one study, they achieved a mean blood pressure of 140/81 mm Hg and in the other a mean of 144/82 mm Hg. “The systolic BP associated with the benefit observed in these trials was significantly higher than what is currently recommended for patients with diabetes.
“In fact, many of the major hypertension clinical trials published in the last decade have shown benefit with regard to cardiovascular and nephropathy risk reduction despite mean systolic BP higher than 130 mm Hg,” the researchers added.
Dr. Cooper-DeHoff and her associates acknowledged that the post hoc design of their study may have led to some confounding of their findings and that the results cannot be generalized to patients with diabetes who do not also have cardiovascular disease.
Disclosures: The study was supported by the National Institutes of Health. Dr. Cooper-DeHoff reported receiving funding from Abbott Laboratories. Her associates reported ties to numerous pharmaceutical companies, including several that manufacture hypertension medications.
PTSD Tied to a Doubling of Veterans' Risk for Dementia
Male veterans with posttraumatic stress disorder appear to be nearly twice as likely to develop dementia as those who do not have PTSD, a study has shown.
The reason for this association is not yet known, nor is it clear whether treatment of PTSD reduces dementia risk. Until more is understood about this newly identified link, it is critical that all patients with PTSD, especially those of advanced age, be followed to screen for cognitive impairment, said Dr. Kristine Yaffe of the University of California, San Francisco, and her associates.
To their knowledge, this is the first study of its kind, although observational reports have found that older patients with PTSD show greater declines in cognitive performance than do control subjects.
“Given that PTSD symptoms often continue late in life and that alterations in the hypothalamic-pituitary-adrenal axis often accompany PTSD, and these in turn may be associated with dementia, there is reason to believe that PTSD might be associated with accelerated brain aging,” they said.
Dr. Yaffe and her colleagues performed a retrospective cohort study involving 181,093 veterans (96% males), aged 55 and older, who received their medical treatment at VA hospitals nationwide. A total of 53,155 of these patients had received a diagnosis of PTSD, while the remainder had no PTSD.
At baseline, the veterans had a mean age of 69 years; they were followed for a mean of 7 years to track incident dementia, which included senile dementias, vascular dementia, Alzheimer's disease, frontotemporal dementia, Lewy body dementia, and dementia “not otherwise specified.”
Patients with PTSD were nearly twice as likely to develop dementia during follow-up as were those without PTSD, with a hazard ratio of 1.77. The cumulative incidence of dementia was about 11% with PTSD, compared with approximately 7% without it, Dr. Yaffe and her associates said (Arch. Gen. Psychiatry 2010;67:608-13).
The link was strong across all dementia subtypes, and it persisted when subjects with diagnoses of clinical depression, substance abuse, or head injury were excluded from the analysis.
The study was limited in that it enrolled primarily male veterans. The study was funded by the U.S. Department of Defense and the National Institute on Aging. Dr. Yaffe and her associates reported ties to Novartis, Zelos Therapeutics, Tethys Bioscience, NPS Pharmaceuticals, Actelion Pharmaceuticals, Sanofi-Aventis, Takeda, the Chatham Institute, and the Pri-Med Institute.
Sipuleucel-T Prolongs Survival in Prostate Cancer
The immunotherapy sipuleucel-T significantly prolonged survival in a study of 512 men with metastatic castration-resistant prostate cancer, confirming the results of two smaller previous trials of this therapeutic “cancer vaccine,” according to the findings of a randomized, placebo-controlled trial.
The experimental treatment increased median survival by 4.1 months and raised the estimated probability of 3-year survival from 23% to 32%, compared with placebo, significant improvements in this population of men with advanced disease, said Dr. Philip W. Kantoff of the Dana-Farber Cancer Institute and Harvard Medical School, Boston, and his coauthors.
As with the previous studies, this trial also showed that sipuleucel-T (Provenge, Dendreon Corp.) did not hinder tumor progression—a paradoxical finding that has yet to be explained, they noted.
Data from the study were pivotal to the Food and Drug Administration's decision earlier this year to approve the immunotherapy for the treatment of asymptomatic or minimally symptomatic castration-resistant prostate cancer.
Dr. Kantoff and his colleagues assessed sipuleucel-T in patients with asymptomatic or minimally symptomatic disease who had an expected survival of at least 6 months. Serum prostate-specific antigen (PSA) levels were 5 ng/mL or more, and serum testosterone levels were less than 50 ng/dL. All had previous androgen-deprivation therapy.
The study subjects were enrolled at 75 medical centers in the United States and Canada, and stratified according to Gleason score, number of bone metastases, and bisphosphonate use.
They were randomly assigned to receive three 1-hour infusions of active drug (341 patients) or placebo (171 patients) every 2 weeks, completing the course of therapy within 1 month. More than 92% of the subjects received all three infusions. Median follow-up was 34 months.
Mortality was approximately 62% with active therapy and 71% with placebo, a relative reduction in the risk of death of 22%. Median survival was approximately 26 months with sipuleucel-T, significantly longer than the 22 months with placebo. Estimated probability of survival at 36 months was approximately 32% with sipuleucel-T, significantly higher than the 23% with placebo.
These benefits were seen across all subgroups of patients, regardless of their status with respect to adverse factors such as high PSA, lactate dehydrogenase, or alkaline phosphatase levels; a greater number of bone metastases; high Gleason score; poor performance status; and the presence of pain.
However, the median time to disease progression, as measured by CT and bone scanning, was not significantly different between the two study groups, at 14.6 weeks for sipuleucel-T and 14.4 weeks for placebo. The reason for this discrepancy is not yet known, but it might be because of “the delayed onset of antitumor responses after active immunotherapy, relative to objective disease progression, which occurred early in this group of patients,” Dr. Kantoff and his associates said (N. Engl. J. Med. 2010;363:411-22).
Sipuleucel-T was generally well tolerated; only three patients did not receive the entire course of treatment because of infusion-related events. “Adverse events that were more frequently reported for sipuleucel-T than for placebo were generally consistent with the release of cytokines,” such as chills, fever, fatigue, nausea, headache, flu-like illness, and myalgia. Most of these developed within 1 day of an infusion and resolved within 1-2 days. One grade 4 adverse event, a case of bacteremia associated with the infusion catheter, was reported.
There was no increase in the rate of cerebrovascular events, as has been reported previously with sipuleucel-T, the investigators noted.