Sharon Worcester is an award-winning medical journalist for MDedge News. She has been with the company since 1996, first as the Southeast Bureau Chief (1996-2009) when the company was known as International Medical News Group, then as a freelance writer (2010-2015) before returning as a reporter in 2015. She previously worked as a daily newspaper reporter covering health and local government. Sharon currently reports primarily on oncology and hematology. She has a BA from Eckerd College and an MA in Mass Communication/Print Journalism from the University of Florida. Connect with her via LinkedIn and follow her on Twitter @SW_MedReporter.
Gene Therapy Overrides Clotting Defect in Hemophilia B
A new gene therapy for hemophilia B appeared to override a genetic defect critical for clotting in a combined phase I and II clinical trial involving six patients with severe disease phenotype.
Hemophilia B is an X-linked disorder resulting from a defect in the gene encoding human factor IX (FIX).
In the study, a single intravenous injection of an adenovirus-associated virus (AAV) vector that expresses FIX improved the bleeding phenotype in all six patients, Dr. Amit C. Nathwani of the University College London Cancer Institute and his colleagues report online in the Dec. 10 New England Journal of Medicine.
The findings, which represent a critical step toward the goal of eliminating the need for long-term intravenous infusions of recombinant clotting factor concentrates in hemophilia patients, were presented simultaneously at the annual meeting of the American Society of Hematology.
Patients with a severe hemophilia B disease phenotype have functional FIX levels less than 1% of normal, leading to frequent life-threatening bleeding episodes and crippling joint disease, and are typically treated with expensive prophylactic injections of FIX protein concentrate two to three times weekly.
After injection with the AAV vector (scAAV2/8-LP1-hFIXco), all patients in the study achieved expression of FIX at 2%-12% of normal levels, allowing four of the patients to discontinue FIX prophylaxis. Those patients remained free of spontaneous hemorrhage at up to 16 months of follow-up, even when they undertook activities – such as training for a marathon – that had caused bleeding in the past, the investigators said (N. Engl. J. Med. 2011, Dec. 10 [doi:10.1056/NEJMoa1108046]).
In the other two patients, the interval between FIX prophylaxis injections was increased to one injection every 2 weeks, except in cases of upcoming sporting or other events that required prophylaxis or that resulted in trauma.
The patients were enrolled into one of three cohorts (two patients in each) and were treated with either a high dose (2 x 10^12 vector genomes/kg), intermediate dose (6 x 10^11 vg/kg), or low dose (2 x 10^11 vg/kg) of vector. Response was generally dose-dependent, with the high dose mediating peak expression at the highest levels in the study (8%-12% of normal values). However, one of the two patients treated with the high dose experienced a transient, asymptomatic increase in serum aspartate and alanine aminotransferase levels caused by AAV8-capsid-specific T cells in the peripheral blood, and the other patient in the high-dose group experienced a slight increase (though still below the upper limit of normal) in liver enzyme levels. Short-course glucocorticoid therapy was effective in both patients, normalizing aminotransferase levels without reducing FIX transgene expression levels, the investigators said.
None of the study participants had an immunologic response to the FIX transgene product, they noted.
It is "very important" to determine the frequency of "clinically significant elevations in aminotransferase levels after gene transfer," the investigators wrote, adding that this will require a larger number of participants treated at the high-dose level.
Follow-up of larger numbers of patients and for longer periods of time will be required to fully characterize the benefits and risks of this gene therapy, as well as to optimize dosing, but the findings of this study suggest that it has the potential to convert severe hemophilia B into a mild form of the disease, and possibly to reverse the disease process entirely, they concluded.
The study was funded by the U.K. Medical Research Council. Dr. Nathwani reported receiving grant funding from the Katharine Dormandy Trust for Haemophilia and Allied Disorders, the U.K. Department of Health, and the National Health Service Blood and Transplant, as well as serving as a consultant for Amsterdam Molecular Therapeutics. The complete list of disclosures for Dr. Nathwani and the other study authors is available with the full text of the article at NEJM.org.
The findings of this landmark study represent the first "unequivocal evidence" of successful gene therapy for hemophilia B and mark a major advance in the field, Dr. Katherine P. Ponder wrote in an accompanying editorial.
The findings suggest that it will be possible to replace the "cumbersome and expensive" protein therapy currently used. Indeed, successful gene therapy could provide substantial cost savings; the annual cost per patient for on-demand protein therapy is about $150,000 and for prophylaxis is about $300,000, for an estimated lifetime cost of more than $20 million, Dr. Ponder said.
This is of particular importance for patients in developing countries, where the cost is prohibitive and the disease remains associated with chronic joint disease and early death.
Thus the "truly remarkable finding" that a single injection of an adenovirus-associated virus vector that expresses FIX can successfully treat hemophilia B patients for more than a year has important implications for all hemophilia patients and possibly for patients with other disorders, she said, noting that "this technology may soon translate into applications for other disorders, such as lysosomal storage disease, alpha-1 antitrypsin deficiency, and hyperlipidemias."
Should the practicing hematologist rush to order this gene therapy vector if it is approved? While the risks are not yet completely clear and further study is needed to determine that the approach is safe, the answer, she said, is: "Probably yes."
Dr. Ponder is with Washington University in St. Louis. Her remarks are summarized from an editorial that accompanied the study (N. Engl. J. Med. 2011 Dec. 10 [doi:10.1056/NEJMe1111138]). She disclosed receiving grants from the National Institutes of Health to study gene therapy for hemophilia using retroviral vectors. She also has been paid to lecture and/or serve on the speakers bureaus for Shire and Genzyme.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Major Finding: After injection with scAAV2/8-LP1-hFIXco, all patients in this study achieved AAV-mediated expression of FIX at 2%-12% of normal levels, allowing four of the patients to discontinue FIX prophylaxis. Those patients remained free of spontaneous hemorrhage at up to 16 months.
Data Source: A combined phase I and II clinical trial.
Disclosures: This study was funded by the U.K. Medical Research Council. Dr. Nathwani reported receiving grant funding from the Katharine Dormandy Trust for Haemophilia and Allied Disorders, the U.K. Department of Health, and the U.K. National Health Service Blood and Transplant and served as a consultant for Amsterdam Molecular Therapeutics. The complete list of disclosures for Dr. Nathwani and the other study authors is available with the full text of the article at NEJM.org.
Oxytocin System Functioning Mediates Effects of Maternal Depression
Children exposed to maternal depression throughout their first year of life are more likely than nonexposed children to develop mental disorders by age 6 years, particularly if their oxytocin system functioning is disordered, according to findings from a longitudinal study of 155 mother-child pairs.
Of the children in the study who were exposed to depression throughout their first year, 60% exhibited mental disorders by age 6 years, compared with only 15% of those born to mothers with no depression or other mental disorders, Ruth Feldman, Ph.D., reported at the annual meeting of the American College of Neuropsychopharmacology.
Anxiety disorders and conduct disorders were the most common conditions exhibited by the exposed children, although depression is likely to show more prominently at the next assessment when the children are aged 9 or 10 years, noted Dr. Feldman of Bar-Ilan University in Ramat Gan, Israel.
The exposed children also demonstrated lower social engagement with their mothers, lower playfulness and creativity, and diminished social involvement, compared with nonexposed children, and also were less verbal and expressed less empathy for the pain, suffering, and embarrassment of strangers, she said at a press briefing held in conjunction with the meeting.
Like their mothers, who had an increased likelihood of having disordered oxytocin functioning and who produced less peripheral oxytocin in their saliva, the children were found to have disordered functioning of the oxytocin system and lower salivary oxytocin levels.
The depressed mothers, as well as their children, had a threefold greater prevalence of a risky variant (a variant with two "G" alleles) of the oxytocin receptor gene.
Of note, the 40% of exposed children who did not develop a mental disorder by age 6 years demonstrated more normal functioning of the oxytocin system, and they had better social engagements and higher levels of empathy, Dr. Feldman said.
Furthermore, these children were born to women who had less disruption of the oxytocin system and a less risky variant (the "A" allele variant) of the oxytocin receptor gene, as well as typical levels of oxytocin in their saliva.
These women, despite their depression, had better emotional skills and an improved capacity for providing adequate care, Dr. Feldman said.
It appears that a properly functioning oxytocin system offers protection against the effects of chronic maternal depression in some children, she added.
Study participants were recruited from a larger sample of nearly 2,000 mothers who participated in a mental health survey when they delivered, and again at 6 and 9 months after delivery. The oxytocin measures and in-home observations between parents and children were conducted in those who participated in this portion of the study. In addition to the 20% of participants who had depression throughout the first postpartum year, 4% were diagnosed with subclinical depression and 4% with subclinical anxiety, and 62% had no signs of mental disorders or symptoms during the first year.
The findings are of interest because the oxytocin system is an open system with cross-generation effects, Dr. Feldman said, explaining that if the system is known to be disrupted – in cases of postpartum depression, for example – oxytocin-related interventions could be provided. Mothers could be instructed to increase maternal touch and gaze, or intranasal oxytocin could be administered, for example.
Such interventions could provide a protective barrier against some of the psychopathologies associated with maternal depression. Indeed, intranasal oxytocin administration to both infants and fathers (whose oxytocin levels also were shown in this study to be lower in the setting of maternal depression) was shown to improve vagal tone, increase the duration of social engagement behavior, and markedly increase salivary oxytocin, Dr. Feldman concluded.
She reported no disclosures.
FROM THE ANNUAL MEETING OF THE AMERICAN COLLEGE OF NEUROPSYCHOPHARMACOLOGY
Major Finding: Of the children in the study who were exposed to depression throughout their first year of life, 60% exhibited mental disorders by age 6 years, compared with only 15% of those born to mothers with no depression or other mental disorders. Of note, the 40% of exposed children who did not develop a mental disorder by age 6 years demonstrated more normal functioning of the oxytocin system, as did their mothers (despite their depression).
Data Source: A prospective longitudinal study of 155 mother-child pairs.
Disclosures: Dr. Feldman reported no disclosures.
IOM Dissects Environmental Risk Factors for Breast Cancer
SAN ANTONIO – Women may reduce their risk of breast cancer by avoiding unnecessary medical radiation throughout life, avoiding the use of combined estrogen-progestin hormone therapy after menopause, avoiding smoking, limiting alcohol intake, and increasing physical activity level, according to a report by the Institute of Medicine’s Committee on Breast Cancer and the Environment.
The committee was convened in response to a request by Susan G. Komen for the Cure to review evidence on the contribution of environmental exposures to the development of breast cancer. "Environment" was broadly interpreted to include all nongenetic contributors to breast cancer development, from growth patterns to chemical and microbial exposures to social and cultural practices across the lifespan.
The most consistent evidence backs a link between breast cancer and combined hormone therapy, exposure to ionizing radiation, excess weight after menopause, and alcohol consumption. The evidence regarding smoking and breast cancer is less consistent, with some studies showing a causal relationship and others showing limited evidence of a relationship, according to the findings, which were reported during a press briefing held in conjunction with the San Antonio Breast Cancer Symposium.
Evidence is particularly conflicting with regard to physical activity, personal use of hair dyes, and exposure to non-ionizing radiation such as that emitted by microwave ovens and other electrical devices.
Possible associations with even less persuasive evidence include secondhand smoke exposure, nighttime shift work (possibly through disruptions to circadian rhythm), and exposures to benzene, ethylene oxide, and 1,3-butadiene. Exposure to bisphenol A (BPA) presents a "plausible hazard" for which little data exist.
In general, environmental factors found to have any possible link with breast cancer development were associated with less than a doubling of risk.
For an individual woman, the potential risk reduction from avoiding environmental factors would vary, and "may be small or may be moderate," committee chair Irva Hertz-Picciotto, Ph.D., a professor at the school of medicine at the University of California, Davis, said during the press briefing.
Nonetheless, the impact of risk factor avoidance could be important at a population level, according to the report.
The IOM committee focused on initial breast cancer occurrence, taking into account changes in the breast over a woman’s lifetime, as well as the potential influence of the timing of certain exposures. Diagnosis, treatment, and screening practices were not addressed.
The committee analyzed evidence amassed by the International Agency for Research on Cancer, the World Cancer Research Fund International, and other authoritative organizations. Those data were supplemented by reviews and original research reports from the peer-reviewed literature.
The evidence reviewed focused primarily on exposure during adulthood, so the committee was unable to address the effects of various exposures across the life course. In addition, many chemical exposures have never been studied for an association with breast cancer.
Furthermore, the contribution of genetic factors and potential gene-environment associations are difficult to assess, the committee conceded.
Topics considered high priority for further research include the role of shift work, endocrine activity, and genotoxicity. Furthermore, research is needed on the "biologic significance of life stages at which environmental risk factors are encountered, what steps may counter their effects, when preventive actions can be most effective, and whether opportunities for prevention can be found for the variety of forms of breast cancer," according to the report.
The report provides a number of strategies for counseling women about how to best prevent breast cancer, according to Dr. Robert Hiatt, a committee member and deputy director of the Comprehensive Cancer Center at the University of California, San Francisco. For example, doctors need to address with their patients the issue of ionizing radiation exposure.
The IOM report was supported by a contract between the National Academy of Sciences and Susan G. Komen for the Cure. Individual authors had no conflicts of interest, according to NAS protocols.
FROM THE SAN ANTONIO BREAST CANCER SYMPOSIUM
Study Provides More Evidence of Autism Immune Component
Evidence increasingly supports the notion of an autoimmune form of autism, and a new study bolsters this theory: specific autoantibodies directed at fetal brain tissue were found in a modest proportion of mothers who have a child with autism.
In an earlier study, researchers at the Medical Investigation of Neurodevelopmental Disorders (MIND) Institute at the University of California, Davis, demonstrated that 12% of women with an autistic child had "unusual" antibodies not present in mothers of typically developing children or in mothers of children with other intellectual developmental disorders. Because the antibodies are immunoglobulin G and thus cross the placenta, this finding raised the hypothesis that they might interact with the fetal brain, leading to dysregulation of development (and ultimately to autism). The researchers therefore expanded their study by testing the effects of the antibodies in pregnant rhesus monkeys.
They found that the offspring of monkeys injected with the IgG showed distinctive autistic characteristics, David G. Amaral, Ph.D., research director at the MIND Institute, reported at the annual meeting of the American College of Neuropsychopharmacology.
Specifically, pregnant monkeys were injected over a 6-week period with either purified autoantibodies to fetal brain proteins from the blood of mothers of children with autism, or with autoantibodies from mothers of typically developing children. The offspring of the monkeys injected with autoantibodies from mothers with an autistic child – but not those injected with samples from mothers of typically developing children – demonstrated social impairment and stereotypic behaviors across several behavioral testing paradigms, Dr. Amaral said during a press briefing held in conjunction with the meeting.
The social impairment as detected by blinded investigators was subtle and did not reach the level of social impairment consistent with autism, but the stereotypy was profound.
"Given that (stereotypy) is one of the clinical signs of autism, we thought this was intriguing," he said, adding: "The ability to reproduce this effect in an animal model was strong evidence that these antibodies may have a disease-causing effect."
Dr. Amaral and his colleagues have replicated these findings in two independent studies, and they are currently extending their analysis to a magnetic resonance imaging study of brain development in the treated monkeys. In prior research by the MIND Institute investigators, a substantial proportion of boys with autism were shown to have precocious brain growth during early childhood, and the MRI studies are designed to determine whether similar patterns of brain development occur in the treated rhesus monkeys.
If confirmed, the findings of this study could lead to screening tests for pregnant mothers, and perhaps to preventive measures for certain types of autism.
For example, if the researchers successfully segment an autoimmune version of autism as a uniform, homogenous version, unique preventive measures and treatment options could be developed, Dr. Amaral explained.
Currently an estimated 1% of children in the United States have an autism-spectrum disorder, and the probability of a mother with an autistic child having a subsequent child with autism is 25% if the subsequent child is a boy, and 9% if the child is a girl. The ability to test for antibodies against fetal brain tissue in pregnant mothers who already have a child with autism could lead to earlier recognition of the disorder in the second child, and to interventions that prevent or reduce the effects of the harmful antibodies, he said.
"What’s intriguing about this line of work is that in the majority of autism we don’t have a lot of targets for intervention or prevention, whereas if this were to be replicated and pan out, these antibodies are quite easy to identify," he added.
Dr. Amaral said he had no disclosures.
FROM THE ANNUAL MEETING OF THE AMERICAN COLLEGE OF NEUROPSYCHOPHARMACOLOGY
Major Finding: The offspring of rhesus monkeys injected with autoantibodies found in some mothers of autistic children showed distinctive autistic characteristics.
Data Source: An animal model expansion of a study involving IgG in autism.
Disclosures: No disclosures were available at press time.
Brain Development Disruptions May Explain Sex Differences in Depression, CVD
Sex differences in the risk of developing co-occurring depression and cardiovascular disease in adulthood may result from disruptions during the development of the stress response circuitry in the fetal brain, according to findings from a longitudinal cohort study.
Specifically, fetal exposure to maternal immune activation resulting from preeclampsia or conditions that restricted fetal growth was shown in the 40-year study to be significantly associated with the co-occurrence of major depressive disorder and cardiovascular disease in adult women, Jill M. Goldstein, Ph.D., said during a press briefing held in conjunction with the annual meeting of the American College of Neuropsychopharmacology.
The findings have implications for the development of early interventions in adults who may be predisposed to this common co-occurrence, said Dr. Goldstein, professor of psychiatry and medicine at Harvard Medical School, Boston.
This is particularly important because co-occurring depression and cardiovascular disease, with a prevalence of about 20%, is predicted to be the leading cause of disability worldwide by 2020, she noted.
"Women are at greater risk than men, and we don’t understand why," said Dr. Goldstein, also director of research at the Connors Center for Women’s Health and Gender Biology at Brigham and Women’s Hospital, Boston.
Given that the stress response circuitry involves a "perfect storm" of brain regions with some of the most striking sex differences – regions that develop and function differently in men and women, and that regulate both mood and cardiac function – Dr. Goldstein and her colleagues hypothesized that disruptions in this circuitry during fetal development might produce the sex-specific vulnerabilities to co-occurring depression and cardiovascular disease later in life.
Indeed, functional magnetic resonance imaging studies in 60 of the 295 adults with fetal exposure to maternal immune activation and their nonexposed siblings in the study cohort showed that fetal exposure was significantly associated with later sex-specific deficits on several measures of stress response brain activity, endocrine function, and autonomic nervous system function.
For example, the functional MRI scans, conducted in 30 exposed and 30 nonexposed subjects who viewed images with negative valence/high arousal vs. neutral valence/low arousal stimuli to assess stress response, indicated that exposure was significantly associated with low high-frequency R-R interval variability (HF-RRV) in response to stress 40 years later.
The association was threefold greater in those with major depressive disorder than in those without major depressive disorder, and exposed women had a significantly higher risk of comorbidity of major depressive disorder and HF-RRV, compared with men (relative risk 1.38). This comorbidity was significantly associated with tumor necrosis factor–alpha levels in the maternal sera of exposed women, compared with nonexposed women.
Also, in response to stress, fetal-exposed women showed significantly greater blood oxygen level-dependent (BOLD) signal changes in the hippocampus, anterior hypothalamus, and anterior cingulate cortex, as well as hypoactivity in the orbitofrontal cortex. In exposed men, only hypoactivity in the orbitofrontal cortex was significantly increased.
Fetal-exposed women with major depressive disorder had the greatest hyperactivity in the anterior hypothalamus, and the greatest hypoactivity in the anterior cingulate cortex, the orbitofrontal cortex, and the hippocampus. This suggests hyperarousal, and lack of cortical and hippocampus control in fetal-exposed women who have major depressive disorder, Dr. Goldstein explained.
Significant associations also were seen between important stress response brain regions and loss of parasympathetic cardiac regulation, she noted.
Higher levels of TNF-alpha/interleukin-10 and interleukin-6 were significantly associated with lower BOLD changes in the hippocampus and anterior cingulate cortex. These co-occurred with disruption of the hormones collected during the MRI scans and timed to the stress response: dehydroepiandrosterone sulfate (DHEAS), testosterone, and estradiol (E2) were lowered, while progesterone and cortisol increased in response to stress.
This study was an expansion of a National Institutes of Health study initiated in the 1950s. Mothers were followed through pregnancy, their sera were stored at the NIH, and offspring were followed for 7 years. For this part of the study, Dr. Goldstein and her colleagues rerecruited exposed offspring and their nonexposed siblings, and followed them for 20 years, into their late 40s.
The findings suggest shared fetal risk factors for the co-occurrence of major depressive disorder and cardiovascular disease risk in women, and they suggest that fetal programming of the stress response circuitry might help explain sex-specific vulnerability to major depressive disorder and cardiovascular disease risk, Dr. Goldstein said.
Understanding the early signals and pathways could lead to early interventions that could lessen disability and perhaps even prevent the illnesses, she concluded.
This study was funded by the National Institute of Mental Health and the NIH Office of Research on Women’s Health. Dr. Goldstein reported no disclosures.
Sex differences in the risk of developing co-occurring depression and cardiovascular disease in adulthood may result from disruptions during the development of the stress response circuitry in the fetal brain, according to findings from a longitudinal cohort study.
Specifically, fetal exposure to maternal immune activation resulting from preeclampsia or conditions that restricted fetal growth was shown in the 40-year study to be significantly associated the co-occurrence of major depressive disorder and cardiovascular diseases in adult women, Jill M. Goldstein, Ph.D., said during a press briefing held in conjunction with the annual meeting of the American College of Neuropsychopharmacology.
The findings have implications for the development of early interventions in adults who may be predisposed to this common co-occurrence, said Dr. Goldstein, professor of psychiatry and medicine at Harvard Medical School, Boston.
This is particularly important because co-occurring depression and cardiovascular disease, with a prevalence of about 20%, is predicted to be the leading cause of disability worldwide by 2020, she noted.
"Women are at greater risk than men, and we don’t understand why," said Dr. Goldstein, also director of research at the Connors Center for Women’s Health and Gender Biology at Brigham and Women’s Hospital, Boston.
However, the stress response circuitry involves a "perfect storm" of brain regions that show some of the most striking sex differences: they develop and function differently in men and women, and they regulate both mood and cardiac function. Dr. Goldstein and her colleagues therefore hypothesized that disruptions in this circuitry during fetal development might produce the sex-specific vulnerabilities to co-occurring depression and cardiovascular disease later in life.
Indeed, functional magnetic resonance imaging studies in 60 of the 295 adults with fetal exposure to maternal immune activation and their nonexposed siblings in the study cohort showed that fetal exposure was significantly associated with later sex-specific deficits on several measures of stress response brain activity, endocrine function, and autonomic nervous system function.
For example, the functional MRI scans were conducted in 30 exposed and 30 nonexposed subjects who viewed images with negative valence/high arousal vs. neutral valence/low arousal to assess the stress response. Fetal exposure was significantly associated with low high-frequency R-R interval variability (HF-RRV), a measure of parasympathetic cardiac control, in response to stress 40 years later.
The association was threefold greater in those with major depressive disorder than in those without major depressive disorder, and exposed women had a significantly higher risk of comorbidity of major depressive disorder and HF-RRV, compared with men (relative risk 1.38). This comorbidity was significantly associated with tumor necrosis factor–alpha levels in the maternal sera of exposed women, compared with nonexposed women.
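For readers who want to check the arithmetic behind a relative risk, a minimal sketch follows. The group counts below are hypothetical, chosen only so the illustration lands near the study's reported RR of 1.38; the actual counts were not reported here.

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk: the event rate in the exposed group divided by
    the event rate in the unexposed (reference) group."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts for illustration only:
# 69 events among 200 subjects in one group vs. 50 among 200 in the other.
print(round(relative_risk(69, 200, 50, 200), 2))  # 1.38
```

Because it compares event rates directly, relative risk is the usual effect measure in cohort studies such as this one.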
Also, in response to stress, fetal-exposed women showed significantly greater blood oxygen level-dependent (BOLD) signal changes in the hippocampus, anterior hypothalamus, and anterior cingulate cortex, along with hypoactivity in the orbitofrontal cortex. In exposed men, only hypoactivity in the orbitofrontal cortex was significantly increased.
Fetal-exposed women with major depressive disorder had the greatest hyperactivity in the anterior hypothalamus, and the greatest hypoactivity in the anterior cingulate cortex, the orbitofrontal cortex, and the hippocampus. This suggests hyperarousal and a lack of cortical and hippocampal control in fetal-exposed women who have major depressive disorder, Dr. Goldstein explained.
Significant associations also were seen between important stress response brain regions and loss of parasympathetic cardiac regulation, she noted.
Higher levels of TNF-alpha/interleukin-10 and of interleukin-6 were significantly associated with lower BOLD changes in the hippocampus and anterior cingulate cortex. These changes co-occurred with disruption of the hormones collected during the MRI scans and timed to the stress response: dehydroepiandrosterone sulfate (DHEAS), testosterone, and estradiol (E2) were lowered, while progesterone and cortisol increased, in response to stress.
This study was an expansion of a National Institutes of Health study initiated in the 1950s. Mothers were followed through pregnancy, their sera were stored at the NIH, and offspring were followed for 7 years. For this part of the study, Dr. Goldstein and her colleagues rerecruited exposed offspring and their nonexposed siblings, and followed them for 20 years, into their late 40s.
The findings suggest shared fetal risk factors for the co-occurrence of major depressive disorder and cardiovascular disease risk in women, and they suggest that fetal programming of the stress response circuitry might help explain sex-specific vulnerability to major depressive disorder and cardiovascular disease risk, Dr. Goldstein said.
Understanding the early signals and pathways could lead to early interventions that could lessen disability and perhaps even prevent the illnesses, she concluded.
This study was funded by the National Institute of Mental Health and the NIH Office of Research on Women’s Health. Dr. Goldstein reported no disclosures.
FROM THE ANNUAL MEETING OF THE AMERICAN COLLEGE OF NEUROPSYCHOPHARMACOLOGY
Major Finding: Fetal exposure to maternal immune activation resulting from preeclampsia or conditions that restricted fetal growth was shown in the 40-year study to be significantly associated with the co-occurrence of major depressive disorder and cardiovascular disease risk in adult women.
Data Source: An expansion of a longitudinal cohort study.
Disclosures: This study was funded by the National Institute of Mental Health and the National Institutes of Health Office of Research on Women’s Health. Dr. Goldstein reported no disclosures.
AGA Releases New Standards for GIs Performing CT Colonography
Computed tomographic colonography is an acceptable alternative to colonoscopy as a colorectal cancer screening method in average-risk, asymptomatic adults, but gastroenterologists planning to use the technology should undergo more extensive training than previously recommended, according to newly updated standards from the American Gastroenterological Association.
The task force report, "AGA Standards for Gastroenterologists for Performing and Interpreting Diagnostic Computed Tomography Colonography: 2011 Update," is scheduled to appear in the December issue of Gastroenterology.
Perhaps most notable is the AGA’s support of this technology for screening of average-risk asymptomatic adults. This marks a change from 2007, when the original AGA task force statement was published. That statement stopped short of supporting CTC for routine colorectal cancer screening in average-risk asymptomatic patients – the evidence was considered insufficient.
In the update, however, the authors note that "several multidisciplinary groups involved in CRC [colorectal cancer] screening guideline development, including the AGA, have endorsed CT colonography for CRC screening."
Both the 2007 and 2011 versions state that CTC is effective for several specific purposes: evaluation of the colon proximal to an obstructing lesion, and failed colonoscopy with continued need to evaluate the colon. CTC can also be considered in patients who cannot undergo, or choose not to undergo, screening methods other than CTC.
Most of the recent, high-quality published studies on CT colonography indicate that its potential benefits as a primary screening tool in average-risk adults most likely exceed the potential harms, according to Dr. Brooks D. Cash, AGAF, of Walter Reed National Military Medical Center, Bethesda, Md., and his coauthors.
"Taken as a whole, the updated body of literature examining screening CT colonography demonstrates sensitivity for CRC and adenomas greater than or equal to 6 mm that approaches that of colonoscopy and is superior to results obtained with other methods of screening," they wrote.
Although data on sensitivity of CTC for larger lesions have been consistent, mixed results have been found for smaller lesions, leading to some controversy. The AGA joins groups including the American Cancer Society and the Blue Cross/Blue Shield Technology Evaluation Center in supporting CTC use for primary screening. CTC has also been endorsed as a primary CRC screening test for average-risk, asymptomatic adults by a multidisciplinary group that includes members of the American Cancer Society, the U.S. Multi-Society Task Force on CRC, and the American College of Radiology.
As for training of gastroenterologists who wish to perform CTC, the AGA standards continue to call for initial training that includes review and interpretation of at least 75 cases with endoscopic correlation, followed by a 6-month preceptorship involving interpretation of at least 150 cases. The 2007 standards had called for a 4- to 6-week preceptorship involving interpretation of at least 25-50 additional cases.
The preceptorship may consist of training with a mentor in person or it "may take place in any of several scenarios (e.g., web-based mentoring)," and the updated standards also call for mastery of new manual skills and didactic information through hands-on individualized instruction by experienced faculty.
"Gastroenterologists training to interpret CT colonography should operate the workstation and review cases in an interactive fashion, whereby they are responsible for the manipulation of the data set. Training requires involvement in the acquisition of studies, which means that at least some of the training must be done at a busy imaging center," they wrote.
The updated standards also note the need for ongoing training and self-assessment, to include formal continuing medical education accredited courses in CT colonography, and the task force emphasizes the importance of collaboration with board-certified radiologists to review extracolonic portions of the CTC exam.
After CTC results are obtained and interpreted, patients with any polyp that is 6 mm or larger, or three or more polyps of any size (in the setting of high diagnostic confidence), need to be referred for consideration of endoscopic polypectomy. The appropriate clinical management of patients with one or two lesions that are 5 mm or smaller is unknown, but such lesions should be reported when diagnostic confidence is high.
Other additions to the updated guidelines include a strong recommendation for participation in the ACR National Radiology Data Registry’s CT Colonography Registry and the AGA Digestive Health Outcomes Registry, and a reminder that gastroenterologists who provide CT or other advanced imaging services, and who bill under Part B of the Medicare Physician Fee Schedule, must be accredited by Jan. 1, 2012, to receive Medicare payment for the technical components of these services.
The AGA standards were updated in light of the increasing uniformity of findings regarding the sensitivity of CTC, which has led to a greater acceptance of the technology for clinical practice, according to the authors. Attention paid to this technology has also increased because of the relatively low (50%-55%) compliance with population-wide CRC screening recommendations.
Multiple reasons for the low compliance rates are cited, and chief among them is the inconvenient, uncomfortable, and invasive nature of other, more widely accepted screening modalities including flexible sigmoidoscopy and colonoscopy, they noted.
CT colonography is an attractive option because of its simpler, noninvasive nature, and although the method requires a preprocedure bowel purge, the possibility of performing CTC without a bowel purge in the future is a promising area of research.
From the patient’s perspective, a sensitive screening test that does not require bowel purge would likely further increase interest in CT colonography, Dr. Barbara Jung, assistant professor of medicine at Northwestern University, Chicago, said in an interview.
Although she does not use CT colonography, Dr. Jung said she has seen an increase in patient requests, and suspects interest will continue to increase among some gastroenterologists to perform the test, particularly as more data become available. The required training is substantial, however, and at this time could be a deterrent for those who are "pretty busy doing what they’re already doing," she said, adding that its use will be institution dependent based on the local expertise and availability.
For board-certified gastroenterologists who are interested in performing CTC, the updated standards "describe in depth" the necessary skills, training, and other requirements for appropriate use and interpretation, Dr. Jung added.
The authors of the updated standards reported having no disclosures.
FROM GASTROENTEROLOGY
Foley Catheter Bests Prostaglandin E2 Gel
Use of a Foley catheter for labor induction in women with an unfavorable cervix at term resulted in a similar cesarean section rate to use of prostaglandin E2 gel for labor induction, but with fewer maternal and neonatal side effects, according to findings from an open-label randomized controlled trial involving more than 800 women.
Cesarean section, most often done for failure to progress during the first stage of labor, was performed in 23% of 411 women induced using a Foley catheter, and in 20% of 408 women induced using vaginal prostaglandin E2 gel (relative risk, 1.13), Dr. Marta Jozwiak of Groene Hart Hospital, Gouda, the Netherlands, and her colleagues from the PROBAAT Study Group reported.
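For readers who want to check the arithmetic, the relative risk is simply the ratio of the cesarean rates in the two arms. The counts below are reconstructed from the rounded percentages reported above, not taken from the trial report itself, so the result comes out slightly higher (1.15) than the published RR of 1.13:

```python
# Relative risk of cesarean section, Foley catheter vs. prostaglandin E2 gel.
# Event counts are approximated from the article's rounded percentages
# (23% of 411 and 20% of 408), not from the trial's exact data.
foley_events = round(0.23 * 411)   # ~95 cesarean sections
foley_n = 411
pge2_events = round(0.20 * 408)    # ~82 cesarean sections
pge2_n = 408

risk_foley = foley_events / foley_n   # risk in the Foley catheter arm
risk_pge2 = pge2_events / pge2_n      # risk in the prostaglandin E2 arm
relative_risk = risk_foley / risk_pge2
print(f"RR = {relative_risk:.2f}")    # RR = 1.15 from rounded inputs
```

The small discrepancy with the published 1.13 reflects rounding of the reported percentages; the trial's exact event counts would be needed to reproduce the published figure.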
The frequency of vaginal instrumental deliveries was similar in the two groups as well (11% and 13% in the Foley catheter and prostaglandin E2 groups, respectively), the investigators said (Lancet 2011 [doi:10.1016/S0140-6736(11)61484-0]).
Two serious maternal adverse events occurred, both in the prostaglandin group. These included one uterine perforation after insertion of an intrauterine pressure catheter and one uterine rupture during oxytocin augmentation, they said, noting that both neonates were born in good clinical condition, but were admitted to the neonatal ward with suspected infection.
Four minor maternal adverse events occurred, including three allergic reactions (in one woman in the Foley catheter group and in two women in the prostaglandin E2 group), and blood loss on a second insertion of the catheter in one woman. Also, the rate of suspected intrapartum infection was significantly lower in the Foley catheter group (1% vs. 3%).
As for neonatal adverse events, the rate of admission to the neonatal ward was significantly higher in the prostaglandin E2 group than in the Foley catheter group (20% vs. 12%), Dr. Jozwiak and her associates said.
Patients in the PROBAAT trial were women beyond 37 weeks' gestation with a singleton pregnancy in cephalic presentation, intact membranes, and an unfavorable cervix as defined by a Bishop score of less than 6. All had an indication for labor induction and no prior cesarean section; they were enrolled at 12 centers throughout the Netherlands between Feb. 10, 2009, and May 17, 2010.
The findings in regard to cesarean section rates with Foley catheter vs. prostaglandin E2 induction were confirmed by a meta-analysis that included data from this study. The meta-analysis also demonstrated that Foley catheter induction was associated with reduced rates of hyperstimulation (odds ratio, 0.44) and postpartum hemorrhage (OR, 0.60), the investigators reported. Although the use of a Foley catheter did not increase the vaginal delivery rate compared with prostaglandin E2, as they had hypothesized it would, the findings nonetheless support the use of Foley catheters, they noted.
“After induction with a Foley catheter, the overall number of operative deliveries for suspected fetal distress was lower, fewer mothers were treated with intrapartum antibiotics, and significantly fewer neonates were admitted to the neonatal ward,” they said, adding: “We therefore think that a Foley catheter should be considered for induction of labor in women with an unfavorable cervix at term.”
Also, in light of the low cost and easy storage of Foley catheters, their use could be suitable for developing countries and low-resource settings, Dr. Jozwiak and her associates noted.
Future studies should compare the use of Foley catheters with other prostaglandin preparations such as prostaglandin E1 (misoprostol) which is becoming increasingly popular worldwide, and should evaluate the use of Foley catheters in women with a prior cesarean section, they suggested.
Dr. Jane E. Norman and Dr. Sarah Stock note that the finding that intracervical placement of a Foley catheter induces cervical ripening without inducing uterine contractions, and is as successful as prostaglandin for inducing labor, could have important implications for women with a prior cesarean section.
In these women, induction with prostaglandins is associated with uterine rupture, they noted, explaining that prostaglandins affect both cervical ripening and contractions simultaneously, whereas the ideal strategy for induction is likely administration of a cervical ripening agent before stimulation of contractions. This, they argue, would decrease the need for fetal monitoring during ripening and reduce the risk of uterine rupture.
“Although women with a previous cesarean section were excluded from Jozwiak and colleagues' study, a Foley catheter could be the ideal induction agent in this situation and should be assessed further in randomized trials,” they said, adding that if such trials are done, avoidance of maternal and neonatal mortality and morbidity should be included as primary outcome measures, as they are “arguably as important as speed and avoidance of cesarean section.”
DR. NORMAN and DR. STOCK are with the MRC Centre for Reproductive Health and the Queen's Medical Research Center at the University of Edinburgh. Their comments were taken from an editorial accompanying the article by Dr. Jozwiak and colleagues (Lancet 2011 [doi:10.1016/S0140-6736(11)61581-X]). Dr. Norman reported receiving fees for acting as a consultant for Preglem and is an unpaid member of an advisory board for Hologic. Dr. Stock had no relevant financial disclosures.
FROM THE LANCET
RA: Adalimumab, MTX Induction Yields Sustained Response
CHICAGO – Combination induction therapy with adalimumab and methotrexate was superior to methotrexate alone for the treatment of patients with early rheumatoid arthritis in the randomized, controlled, multicenter HIT HARD trial.
The mean 28-joint count disease activity score (DAS28) was reduced from 6.2 to 3.0 after a 24-week induction phase in 87 RA patients randomized to receive combination therapy, a significantly greater reduction than the decrease from 6.3 to 3.5 in 85 patients randomized to receive methotrexate monotherapy for 24 weeks, Dr. Gerd R. Burmester reported at the annual meeting of the American College of Rheumatology.
After 24 weeks, adalimumab was withdrawn in the patients on combination therapy, and both groups received only methotrexate for an additional 24 weeks. The 48-week DAS28 – the primary end point of the double-blind study – remained numerically, but not significantly, lower in the group that initially received combination therapy, compared with the group that received initial monotherapy (3.2 vs. 3.4, respectively), said Dr. Burmester, professor of rheumatology at the Charité-Universitätsmedizin Berlin.
As with the primary outcome measure, secondary outcome measures – including remission (defined as a DAS28 of less than 2.6), ACR50 and ACR70 response rates (indicating 50% and 70% improvement in the signs and symptoms of disease), and functional status as measured by the Health Assessment Questionnaire (HAQ) – differed significantly between the combination therapy and methotrexate monotherapy groups at week 24, but not at week 48.
For example, 47% vs. 31% of patients in the combination and monotherapy groups, respectively, achieved remission by week 24, but those percentages changed to 44% and 37% by week 48. Also, HAQ scores, which were 1.4 and 1.3 at baseline, were 0.49 vs. 0.72 at 24 weeks, compared with 0.60 and 0.65 at 48 weeks.
However, despite the lack of a significant difference in the DAS28 scores and the secondary outcome measures between the two groups at 48-week follow-up, there was a significant difference in regard to radiographic progression of disease: mean total van der Heijde/Sharp scores at that time were 6.3 in the combination therapy group and 11.4 in the monotherapy group, Dr. Burmester said. The extent to which this difference is clinically meaningful remains unclear, he noted.
Patients in the study had early RA with a duration of no more than 1 year (87% had disease duration of 3 months or less) and had not been treated previously with disease-modifying antirheumatic drugs. All had active disease, swollen and tender joint counts of at least six, and C-reactive protein levels of at least 10 mg/L. Those in the combination therapy group received 15 mg/week of subcutaneous methotrexate plus 40 mg of adalimumab every other week; monotherapy patients received 15 mg/week of methotrexate plus placebo. No new safety signals were detected during the course of the study.
The findings suggest that combination therapy with adalimumab and methotrexate is significantly superior to methotrexate alone for induction therapy in early RA, and that combination therapy produces sustained benefits in regard to radiographic progression at 48 weeks, even when adalimumab is withdrawn at 24 weeks.
"The numerical increase in the clinical outcome parameters of the group on combination therapy, which became apparent at around 40 weeks following treatment initiation, may reflect the loss of response after removal of adalimumab and is a finding that requires additional study, Dr. Burmester noted.
In an unrelated study also presented at ACR 2011, Dr. Arthur F. Kavanaugh, professor of medicine and director of the Center for Innovative Therapy in the division of rheumatology, allergy, and immunology at the University of California, San Diego, and his colleagues similarly found that patients on combination adalimumab and methotrexate who achieved stable low disease activity were able to maintain that response following adalimumab withdrawal in the OPTIMA (Optimal Protocol for Methotrexate and Adalimumab Combination Therapy in Early RA) trial. They also found, however, that patients with a low HAQ score at baseline were more likely than other patients to fare well after adalimumab withdrawal, a finding that suggests that those with higher scores might do better if combination therapy is continued.
Still, this question remains, according to Dr. Burmester: "Is it medically or economically justifiable to use this [combination] induction therapy, which rapidly leads to improvement, or should we just wait for the success of methotrexate later?"
Another question that warrants study is whether higher doses of methotrexate would be better, he said, adding, "But there is one very important message: Treat as early as possible."
This study was funded by the German Ministry of Science. Dr. Burmester disclosed financial and other relationships with Abbott, MSD, Pfizer, and UCB. Other study authors also disclosed relationships with these companies and several others, including Bristol-Myers Squibb, Horizon Pharma, Merck Pharma, Merck Serono, Nitec Pharma, Novartis, Roche, and Wyeth Pharmaceuticals.
FROM THE ANNUAL MEETING OF THE AMERICAN COLLEGE OF RHEUMATOLOGY
Major Finding: DAS28 was reduced from 6.2 to 3.0 after a 24-week induction phase in 87 RA patients randomized to receive combination therapy, which was significantly greater than the reduction from 6.3 to 3.5 in 85 patients randomized to receive methotrexate monotherapy for 24 weeks. After 24 weeks, adalimumab was withdrawn in the combination therapy patients, and both groups received only methotrexate for an additional 24 weeks; the 48-week DAS28 – the primary end point of the double-blind study – remained numerically, but not significantly, lower in the group who initially received combination therapy, compared with the group who received initial monotherapy (3.2 vs. 3.4, respectively).
Data Source: The randomized, controlled, double-blind HIT HARD trial.
Disclosures: This study was funded by the German Ministry of Science. Dr. Burmester disclosed financial and other relationships with Abbott, MSD, Pfizer, and UCB. Other study authors also disclosed relationships with these companies and several others, including Bristol-Myers Squibb, Horizon Pharma, Merck Pharma, Merck Serono, Nitec Pharma, Novartis, Roche, and Wyeth Pharmaceuticals.
Tots With Asthma Benefit From Intermittent Budesonide
An intermittent high-dose budesonide regimen and a daily low-dose regimen resulted in similar reductions in asthma exacerbations in preschool children with recurrent wheezing, but the intermittent regimen was associated with significantly less drug exposure, according to a report in the Nov. 24 issue of the New England Journal of Medicine.
In the randomized, double-blind MIST (Maintenance and Intermittent Inhaled Corticosteroids in Wheezing Toddlers) trial, the rates of exacerbations per patient-year were 0.97 in the 139 patients who received daily budesonide and 0.95 in the 139 patients who received intermittent budesonide (relative rate with the intermittent regimen, 0.99), Dr. Robert S. Zeiger of Kaiser Permanente, San Diego, and his colleagues said.
No significant differences were seen between the groups with respect to time to the first exacerbation (hazard ratio, 0.97), time to the second exacerbation (HR, 0.79), or frequency of treatment failure. Nor were significant differences seen between the groups on a number of prespecified secondary outcomes, including the rate of respiratory tract illnesses per patient-year, the number of respiratory tract illnesses in which prednisolone was administered, the frequency of treatment for respiratory tract illnesses, and the time to the first treatment for a respiratory tract illness.
The rates of nonserious adverse events and serious adverse events were also similar in the two groups.
In addition, the mean cumulative exposure to budesonide was reduced significantly – by 104 mg – over the year-long treatment period in the intermittent regimen group vs. the daily regimen group (45.7 mg vs. 149.9 mg), the investigators said (N. Engl. J. Med. 2011;365:1990-2001).
This finding is of note because an association between daily use of inhaled glucocorticoids and significant reductions in height growth was previously shown, they said.
"Concern about growth retardation and parental resistance to a daily regimen of inhaled glucocorticoids for young children, who usually have only episodic but often severe symptoms, stimulated a search for alternative strategies – specifically intermittent therapy with inhaled glucocorticoids," they explained, adding that intermittent 7-day courses initiated during respiratory tract illnesses were subsequently shown to significantly reduce the severity of respiratory symptoms, compared with placebo, without affecting linear growth.
The benefits were greatest in children with positive values on the modified asthma predictive index (API), they said.
The current study was conducted to determine whether a daily low-dose regimen would be superior to an intermittent regimen in young children with positive values on the modified API, as well as recurrent wheezing, at least one exacerbation in the previous year, and low impairment defined by infrequent use of albuterol and infrequent night awakenings between episodes, they said.
The multicenter parallel-group trial included children aged 12-53 months. Treatment included a 2-week run-in period with nightly placebo doses of budesonide inhalation suspension plus albuterol given as needed in all patients. The run-in period was followed by a 52-week treatment period during which those in the intermittent treatment group received budesonide inhalation suspension given as 1 mg twice daily for 7 days when a respiratory tract illness was identified, and those in the daily treatment group received budesonide inhalation suspension given at a 0.5-mg dose every night. Corresponding placebo doses were also provided in each group.
The findings indicate that daily budesonide dosing is not superior to intermittent dosing for preventing asthma exacerbations in young children at risk for asthma and future exacerbations. The results have implications for the preparation of future guidelines, particularly given that current U.S. guidelines and Global Initiative for Asthma (GINA) guidelines recommend daily therapy with inhaled glucocorticoids as the preferred option for young children with recurrent wheezing and risk factors for persistent asthma. GINA guidelines also caution against daily high-dose therapy with inhaled glucocorticoids for prolonged periods, and instead recommend use of the lowest effective dose.
The MIST investigators noted that their findings may not be applicable to young children with asthma that is more severe or otherwise different from that in the children in their study.
"Daily or intermittent use of inhaled glucocorticoids, or even short courses of oral glucocorticoids started at the onset of wheezing episodes, may not be as efficacious in preschool-age children with a first episode, with transient or infrequent wheezing, or without an asthma diagnosis or a high risk of asthma," they said.
This study was supported by grants from the National Heart, Lung, and Blood Institute; the Washington University, St. Louis, Clinical and Translational Science Awards (CTSA) Infrastructure for Pediatric Research; the Madison CTSA; and the Colorado CTSA (via a grant from the National Center for Research Resources of the National Institutes of Health); and by grants to the General Clinical Research Centers at Washington University, National Jewish Health, and the University of New Mexico, Albuquerque. Study drug and matching placebo were donated by AstraZeneca. Dr. Zeiger and several other authors had numerous disclosures, including various financial relationships with pharmaceutical companies. The complete list of disclosures is available with the full text of the article at NEJM.org.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Major Finding: Rates of exacerbations per patient-year were 0.97 and 0.95 with daily budesonide and intermittent budesonide, respectively. Mean cumulative exposure to budesonide was reduced significantly in the intermittent regimen group vs. the daily regimen group (45.7 mg vs. 149.9 mg).
Data Source: Randomized, double-blind MIST trial.
Disclosures: This study was supported by grants from the National Heart, Lung, and Blood Institute; the Washington University, St. Louis, Clinical and Translational Science Awards (CTSA) Infrastructure for Pediatric Research; the Madison CTSA; and the Colorado CTSA (via a grant from the National Center for Research Resources of the National Institutes of Health); and by grants to the General Clinical Research Centers at Washington University, National Jewish Health, and the University of New Mexico, Albuquerque. Study drug and matching placebo were donated by AstraZeneca. Dr. Zeiger and several other authors had numerous disclosures, including various financial relationships with pharmaceutical companies. The complete list of disclosures is available with the full text of the article at NEJM.org.
Adverse Events in Elderly Mostly From Common Drugs
Nearly 100,000 Americans aged 65 years and older are hospitalized each year after adverse drug events, and most of these emergencies result from a few commonly used medications, according to a study from the Centers for Disease Control and Prevention.
Of an estimated 99,628 hospitalizations – about half of them in people aged 80 years and older – 66% were the result of unintentional overdoses, Dr. Daniel S. Budnitz, director of the CDC's Medication Safety Program, and his colleagues reported in the Nov. 24 issue of the New England Journal of Medicine.
The medications most often implicated were warfarin, insulins, oral antiplatelet agents, and oral hypoglycemic agents, which either alone or in combination accounted for 33%, 14%, 13%, and 11% of the hospitalizations, respectively, the investigators reported (N. Engl. J. Med. 2011;365:2002-12).
Emergency department visits resulting in hospitalization were more likely than those not resulting in hospitalization to involve unintentional overdoses (66% vs. 46%) and to involve at least five medications (55% vs. 40%).
The researchers noted that medications typically deemed high risk or inappropriate for elderly patients resulted in relatively few emergency hospitalizations, accounting for only about 1% and 7% of the 65-and-over emergency admissions, respectively.
The findings have important implications for reducing harm and health care costs among older adults, said Dr. Budnitz and his colleagues. Such detailed and drug-specific data can help focus current patient-safety efforts, such as the Partnership for Patients, they said. That $1 billion federal initiative has the goal of decreasing preventable hospitalizations 20% by the end of 2013.
The investigators used adverse-event data for 2007 through 2009 from the National Electronic Injury Surveillance System – Cooperative Adverse Drug Event Surveillance Project. They based their national estimates on an analysis of 5,077 emergency hospitalizations in older adults at 58 nonpediatric hospitals that participate in the surveillance system.
The numbers probably represent underestimates of emergency hospitalizations, the investigators said. They explained that some patient groups – such as those admitted for diagnostic evaluation or transferred from other hospitals – were not represented in the analysis.
"Our findings suggest that efforts to improve medication safety for older adults should focus on areas in which improvements are most likely to have sizable, clinically significant, and measurable effects, such as improving the management of antithrombotic and antidiabetic drugs," the investigators concluded.
In a press statement, Dr. Budnitz acknowledged that blood thinners and diabetes medications are critical medicines for many older adults. "Doctors and patients should continue to use them but remember to work together to safely manage them," he said.
Neither Dr. Budnitz nor any of the other investigators disclosed any conflict of interest.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Major Finding: The medications most often implicated in emergency hospitalizations for adverse drug events in people aged 65 years and older were warfarin, insulins, oral antiplatelet agents, and oral hypoglycemic agents. Either alone or in combination, those drugs accounted for 33%, 14%, 13%, and 11% of the hospitalizations, respectively.
Data Source: An analysis of adverse-event data from the National Electronic Injury Surveillance System – Cooperative Adverse Drug Event Surveillance Project.
Disclosures: None of the investigators reported any conflict of interest.