Get the Facts Straight to Improve Bone Health


NEW ORLEANS — With a virtual alphabet soup of vitamin and mineral supplements available—and a constant barrage of new nutritional advice each week—it's a challenge to know what truly bolsters bone health, Neil Binkley, M.D., said at the annual meeting of the International Society for Clinical Densitometry.

General malnutrition is actually a common phenomenon in the United States, said Dr. Binkley, of the Institute on Aging at the University of Wisconsin in Madison. According to one study, 11% of patients older than 65 are undernourished. More importantly for bone health, some studies have suggested that elderly patients with fractures are more likely to be malnourished. Keeping an eye out for malnourished patients could help reduce the risk of falls and fractures, he said.

Although supplements provide an easy solution, food is still the best source of vitamins and minerals. Dr. Binkley shared the following diet and nutrition tips:

Phosphorus

Phosphorus insufficiency is not a common problem, but it tends to occur in some of the more vulnerable populations. Phosphorus deficiency decreases mineralization and osteoblast function while increasing osteoclast function.

An estimated 15% of women over age 80 receive less than 70% of the U.S. recommended daily allowance (RDA) of phosphorus (1,000 mg). It has also been suggested that patients who fail to respond to calcium supplementation may, in fact, have inadequate phosphorus intake.

Vitamin D

Make sure patients are aware that not all dairy products are fortified with vitamin D. “You can't get vitamin D in food unless you happen to like liver or lots of salmon or mackerel,” Dr. Binkley said.

Vitamin D toxicity is less of a concern than it once was. Recommended intakes range from 1,200 to 1,500 IU per day, and levels exceeding 10,000 IU per day are believed toxic, so there is a wide margin of safety.

Supplements may be necessary to get enough vitamin D. Dr. Binkley noted that to get 1,000 IU of vitamin D, you would need to drink half a gallon of milk or eat 40 egg yolks. Getting a little sun is also an option.

The bigger problem with ensuring that patients get enough vitamin D may be in obtaining a good assay, Dr. Binkley said. In one study, he and his colleagues used four different assays to measure patient vitamin D levels. Although the four methods agreed quite well for some patients, there were big differences for other patients.

Attempts to standardize vitamin D assays are ongoing. High-performance liquid chromatography (HPLC) appears to provide the best results. Dr. Binkley advised that if HPLC and commercial assays agree that a patient's vitamin D levels are low, they probably are. But if commercial assays indicate a patient's levels are not low, consider HPLC. He also noted that when patients receive very high, prescription-level doses (50,000 IU) of vitamin D, at least one of the commercial 25-hydroxy assays detects only about half of it in the blood.

Vitamin A

A family of about 25 compounds constitutes vitamin A, but the active component is retinol. The RDA for vitamin A is 2,600 IU (800 mcg) per day for men and 2,300 IU (700 mcg) per day for women.

The effects of getting too much vitamin A are unclear. Generally, it's been assumed that the body has built-in safeguards to avoid vitamin A toxicity.

Yet it's theorized that excessive vitamin A suppresses osteoblast activity and stimulates bone resorption. In addition, epidemiologic data suggest that consumption of more than 5,000 IU daily increases fracture risk, but clinical studies have not confirmed this association.

Dietary sources of vitamin A include liver, fish, and fortified foods such as dairy products; certain fruits and vegetables are high in carotenoids. Vitamin A supplementation is considered necessary only in special situations, and patients should be counseled never to take synthetic retinol.

Vitamin K

Low vitamin K levels have been reported in patients with osteoporotic fractures, and epidemiologic data show an increased risk of hip fracture with low vitamin K levels. But vitamin K doesn't linger in the blood for very long, so it's difficult to get an accurate measure, Dr. Binkley said.

Most existing data come from Japan, where the form of vitamin K taken differs from that used in the United States. The Japanese studies used 45 mg per day and showed sustained bone mineral density (BMD) and vertebral fracture-prevention benefits. Adequate intake of vitamin K in the U.S., however, is about 100 mcg per day. It's probably too early to recommend supplementation, he concluded.

Magnesium

Inadequate magnesium is associated with decreased parathyroid hormone. Epidemiologic studies suggest a positive association between increased magnesium intake and BMD. But data from the Women's Health Initiative found high magnesium intake was not protective of BMD.

The bottom line is to eat foods that contain magnesium, including whole grains, vegetables, and nuts. There are no data to support the use of magnesium supplements, Dr. Binkley said.

Caffeine

It's been assumed that caffeine is harmful to bone because it leads to increased urinary calcium loss. But several studies have shown that decreased calcium absorption is what actually occurs. “The gist is that for each cup of coffee that we drink, there is a calcium loss of about 5 mg.” That means “we need to put about 2 tablespoons of milk in our coffee,” Dr. Binkley said.

Protein

One study of elderly patients found that those receiving protein supplements were less likely to have fractures. In fact, patients with higher protein intake and adequate calcium had the best outcomes, suggesting a synergistic effect between protein and calcium.


Team-Based Approach Key to Care In Peripartum Cardiomyopathy


ASHEVILLE, N.C. — Focus on the woman's health in those rare cases of peripartum cardiomyopathy, said Thomas S. Ivester, M.D., at the Southern Obstetric and Gynecologic Seminar. “Maternal health is of paramount importance in this situation,” said Dr. Ivester, of the department of maternal-fetal medicine at the University of North Carolina at Chapel Hill.

Cardiomyopathy is an infrequent but potentially fatal complication of pregnancy. The mortality rate is 0.4 per 100,000 live births. Risk factors during pregnancy include multiparity, advanced age, African American race, and preeclampsia.

Care of critically ill pregnant women requires a team-based approach, with good communication among caregivers and specialists. Obstetricians can serve a vital role in educating critical care colleagues about treating pregnant patients who are critically ill.

In particular, “cardiac indices and central venous pressure are notoriously inaccurate in critically ill gravida. This is especially so with preeclampsia,” said Dr. Ivester. Use echocardiography to assess volume, or use a pulmonary artery (PA) catheter to get a wedge pressure.

Fetal decompensation is frequently a warning sign of subsequent significant maternal decompensation. “Once it's detected, cardiac monitoring of the fetus should probably be ceased until the mom is completely stabilized. Intervention in that scenario is probably ill advised,” said Dr. Ivester.

In patients who have significant hemorrhage or in those who may have suffered some type of hypovolemic insult or have been in shock, dopamine can be used to preserve and enhance renal and placental perfusion. “So a renal dose of dopamine, you can also consider as a placental dose of dopamine,” Dr. Ivester said.

Whenever possible, delivery should be reserved for obstetric indications. Vaginal delivery is preferred, because it is tolerated better by the woman. These patients should have prophylaxis for deep vein thrombosis, which can be accomplished by mechanical or chemical means.

“Close follow-up of any case of peripartum cardiomyopathy is critical,” Dr. Ivester said. He suggests serial echocardiography to evaluate the recovery of left ventricular function. Avoiding subsequent pregnancies until function improves is important, so make sure these patients are on adequate contraception. Earlier implantable cardioverter defibrillator (ICD) placement or listing for transplant should be considered for patients who have significant rhythm deterioration or persistently low ejection fractions.

“Most importantly, … obstetric issues do not disappear with delivery. [The mother] is still an obstetric patient, even when the baby is delivered,” Dr. Ivester said. Peripartum changes can persist in some women for many weeks after delivery, and the obstetrician still has an important role to play in their care, especially in helping to differentiate the changes associated with pregnancy from other conditions.

In a normal pregnancy, blood volume increases 50%–100%. Systemic vascular resistance decreases 20%, and the blood is hypercoagulable. Cardiac output can fluctuate. Respiratory alkalosis may occur. The heart is displaced upward and to the left. The patient will have slight left ventricular hypertrophy and effusion that can be seen on echocardiography. There is frequently a left axis deviation due to these changes. There also may be nonspecific ST segment and T-wave changes.

Profound cardiac changes also occur during labor. Systemic vascular resistance can go up 10%–25% with each contraction. “That's a substantial increase for a patient with a very sick myocardium or those with significant valvular diseases,” Dr. Ivester said. Women in labor will autoinfuse 300–500 cc every time they contract, especially if they are near term. Cardiac output fluctuates as labor progresses. In early labor (<3 cm), cardiac output goes up about 17%. In the second stage of labor (>8 cm), cardiac output increases at least 34%.


T1-Weighted MRI Confirms Postdural Puncture Headache


ASHEVILLE, N.C. — If you suspect a postdural puncture headache but aren't sure, order a T1-weighted MRI with gadolinium contrast for the patient, David C. Mayer, M.D., advised at the Southern Obstetric and Gynecologic Seminar.

“It used to be that there were no imaging studies available to make the diagnosis of postdural puncture headache. That has now changed,” said Dr. Mayer, a professor of obstetrics and gynecology and of anesthesiology at the University of North Carolina at Chapel Hill.

Signs of postdural puncture headaches (PDPH) cannot be seen on CT scans (with and without contrast) or noncontrast MRI.

T1-weighted MRI with gadolinium contrast, however, reveals changes that can confirm the diagnosis of PDPH, and it rules out more serious conditions, such as subdural hematoma and intracranial masses. The two key findings on T1-weighted contrast MRI are diffuse meningeal enhancement and descent, or sagging, of the brain. “The meninges … light up with gadolinium,” Dr. Mayer explained.

Less frequently, the pituitary may appear large—though this can be seen with CT as well—and engorged cerebral venous sinuses may also be seen.

Downward displacement of the brain (similar to a Chiari malformation) can also be seen with this type of imaging. There may also be descent of the cerebellar tonsils, obliteration of the prepontine and perichiasmatic cisterns, flattening of the optic chiasm, crowding of the posterior fossa, and decreased ventricular size, according to Dr. Mayer.

PDPH onset commonly occurs while the patient is in the hospital. The headache usually has a postural component, worsening on standing and easing when the patient lies down. Other common symptoms include neck pain, nausea and vomiting, changes in hearing, and visual blurring or field cuts. Atypical symptoms include interscapular pain, low-back pain, facial numbness or weakness, galactorrhea, and radicular upper-limb symptoms.

“What people are now learning is that it is not just a pressure problem, it's a volume problem,” Dr. Mayer said.

Cerebrospinal fluid (CSF) volume is tightly regulated, and the system compensates when volume changes occur. Intracranial veins dilate to maintain intracranial volume, and extensive venodilation may exert pressure on pain-sensitive structures such as the meninges. The pituitary may enlarge. Brain sag, possibly a result of reduced CSF pressure and volume, can compress and stretch structures and veins in the brain, increasing the risk of subdural hematoma.


High Blood Pressure Found in Almost Half of Obese Children

Article Type
Changed
Display Headline
High Blood Pressure Found in Almost Half of Obese Children

WASHINGTON — Almost half of the obese children presenting to one behavioral weight control program had blood pressure in the hypertensive or prehypertensive range, according to data presented at the annual meeting of the Pediatric Academic Societies.

Using new National Heart, Lung, and Blood Institute (NHLBI) criteria for the diagnosis, evaluation, and treatment of high blood pressure, 29.2% of the 168 children involved in this study were hypertensive and 14.3% were prehypertensive, Monique Higginbotham, M.D., of Children's Hospital of Pittsburgh, said in a poster presentation at the meeting, which was sponsored by the American Pediatric Society, the Society for Pediatric Research, the Ambulatory Pediatric Association, and the American Academy of Pediatrics.

The cross-sectional study enrolled children aged 8–12 years with a mean body mass index (BMI) of 36 kg/m².

At initial screening, the children were evaluated for height and weight, and their BMI was calculated. Blood pressure was measured three times at 5-minute intervals during a single visit, using a mercury sphygmomanometer. The mean of the three measurements was used as the final blood pressure.

According to the NHLBI guidelines, hypertension in children is defined as systolic or diastolic blood pressure levels that are at or above the 95th percentile for gender, age, and height. Prehypertension in children is defined as systolic or diastolic blood pressure levels that are at or above the 90th percentile but less than the 95th percentile. Normotensive children have systolic or diastolic blood pressure levels less than the 90th percentile.

Systolic blood pressure correlated positively with BMI. The prevalence of prehypertension and hypertension did not differ statistically by age, gender, or race/ethnicity.


Desensitization Offers Hope to Gout Patients Allergic to Allopurinol

Article Type
Changed
Display Headline
Desensitization Offers Hope to Gout Patients Allergic to Allopurinol

DESTIN, FLA. — Oral desensitization appears to be a safe and effective alternative for patients who are allergic to allopurinol and who cannot take other urate-lowering drugs for gout, Adel G. Fam, M.D., reported at a rheumatology meeting sponsored by Virginia Commonwealth University.

Although 1%–3% of patients experience a pruritic maculopapular rash in response to allopurinol, severe allopurinol hypersensitivity syndrome (AHS) occurs in only about 0.4% of patients, said Dr. Fam, a professor of rheumatology at the University of Toronto.

Dr. Fam suggested that allopurinol desensitization be considered in gout patients with any of the following circumstances:

▸ Renal impairment, which renders uricosuric drugs ineffective.

▸ Underexcretion hyperuricemia with allergy, intolerance, or contraindications to both probenecid and sulfinpyrazone.

▸ Overproduction/overexcretion hyperuricemia, which—when coupled with uricosurics—can increase the risk of renal stones.

▸ History of transplantation, renal insufficiency, and severe and debilitating gout.

▸ Need to prevent malignancy-associated hyperuricemia and tumor lysis syndrome during cytolytic therapy for hematologic malignancies, when the resulting massive uricosuria precludes the use of uricosuric drugs.

The standard allopurinol desensitization protocol starts patients at a 50-mcg dose of allopurinol in suspension. The dose is gradually increased at 3-day intervals up to a target dose of 50–100 mg/day (in tablet form). The dosage can be adjusted if a rash occurs, Dr. Fam said at the meeting, also sponsored by the International Society for Clinical Densitometry.

A modified protocol is recommended for high-risk patients, such as elderly patients with multiple concomitant medical conditions, a more severe rash, or eosinophilia. This protocol begins with allopurinol, 10 mcg or 25 mcg, in suspension, and the dosage is titrated every 5–10 days.

In a retrospective study of 32 patients, 78% were able to tolerate long-term allopurinol therapy following desensitization (Arthritis Rheum. 2001;44:231–8).

The diagnostic criteria for AHS include a definite history of exposure to allopurinol, lack of exposure to another drug that may have caused similar symptoms, and fulfillment of either two major criteria or one major and one minor criterion. Major criteria include worsening renal function, acute hepatocellular injury, and rash (toxic epidermal necrolysis, erythema multiforme, diffuse maculopapular rash, or exfoliative dermatitis). Minor criteria include fever, eosinophilia, and leukocytosis.


Imaging Breakthroughs Reveal Early AD Changes

Article Type
Changed
Display Headline
Imaging Breakthroughs Reveal Early AD Changes

WASHINGTON — Imaging techniques designed to enable identification of preclinical Alzheimer's disease were showcased in numerous presentations at an international conference sponsored by the Alzheimer's Association.

Preclinical Biochemical Changes

Using magnetic resonance spectroscopy (MRS), researchers in the United Kingdom identified biochemical changes in the posterior cingulate in a group of symptom-free subjects genetically destined to develop Alzheimer's disease (AD).

The researchers imaged seven volunteers with familial AD, who carry presenilin-1 or amyloid precursor protein gene mutations and have an almost 100% chance of developing AD, according to Alison Godbolt, M.B., of the Dementia Research Centre at University College London.

At the time of the study, these individuals had normal memory. Six healthy volunteers without familial AD were also recruited to serve as controls.

MRS provides information about select chemicals in a specific area of the brain that are involved in metabolism. MRS is performed using the same scanners as magnetic resonance imaging (MRI) and takes about 45 minutes.

The researchers looked at a single voxel along the midline of the posterior cingulate, a region known to be involved in AD. They measured the ratios of N-acetylaspartate to creatine and of myo-inositol to creatine.

Subjects with the genetic mutation had N-acetylaspartate/creatine ratios that were 10% lower, and myo-inositol/creatine ratios that were 20% greater, than those of the control group, although the difference in myo-inositol/creatine ratios did not reach statistical significance. “Interestingly, other researchers have found the same changes in people who already have the disease,” Dr. Godbolt said.

Reduced levels of N-acetylaspartate are thought to reflect nerve cell dysfunction and loss; increased myo-inositol levels are thought to reflect increased inflammation. In addition, the volunteers with a gene mutation who were closest to their predicted age of onset had the most abnormal levels of these two chemicals, “suggesting a gradual buildup of changes over the several years before symptoms begin,” she said.

Screening via Hippocampal Size

Reduced hippocampal volume on MRI, combined with the results of the Mini Mental State Examination (MMSE), appears to do a better job of identifying patients with mild cognitive impairment (MCI) and AD than MMSE alone, according to a poster presented by Claire K. Sandstrom, a medical student at Duke University in Durham, N.C., and her colleagues.

Several recent studies have shown that individuals with MCI have smaller hippocampal volumes on MRI than healthy controls. This is especially true for the subgroup of individuals with MCI who later convert to AD.

The researchers evaluated 18 volunteers (11 men) with MCI and 17 volunteers (8 men) with normal cognition with the MMSE and MRI. Those with MCI were age 74 years on average and had a mean MMSE score of 27, while the control group was age 70 years on average and had a mean score of 28, Ms. Sandstrom told this newspaper.

Hippocampal atrophy was greater in the volunteers with MCI than in the controls, and left hippocampal volume was significantly smaller than right hippocampal volume only in people with MCI.

The researchers developed receiver operating characteristic curves to evaluate the ability of left hippocampal volume, right hippocampal volume, MMSE score, and the combination of left hippocampal volume and MMSE score to accurately identify patients with AD and MCI. After analyzing these curves, the researchers concluded that left hippocampal volume was superior to MMSE alone in identifying patients with AD and MCI. “Left hippocampal volume added significantly to the discriminatory capacity of the MMSE scores for differentiating between cognitively normal individuals and those with MCI,” the authors wrote.

DTI Reveals Brain Changes in MCI

Researchers have identified changes in the left and right anterior hippocampus and amygdala in patients with MCI and in those with mild cognitive complaints but not in cognitively normal subjects, using diffusion-tensor imaging (DTI).

The researchers imaged 27 individuals with MCI (mean age 74 years), 25 individuals with cognitive complaints (mean age 73 years), and 33 healthy controls (mean age 72 years), according to a poster presented by John D. West of the Brain Imaging Lab at Dartmouth College in Hanover, N.H., and his colleagues.

The participants were drawn from the ongoing Dartmouth Memory and Aging Study. The groups were balanced for age, education, and sex. They also were assessed using the California Verbal Learning Test.

DTI reveals disruptions of the white matter tracts that are not visible on MRI. Within white matter, water moves parallel to tracts. Conventional MRI can distinguish white from gray matter but can provide very little detail about the white matter; MRI cannot observe or quantify specific fiber tract directions.

DTI relies on the principle that water diffusion is affected by the properties of the medium in which it occurs. Diffusion within biologic tissues reflects tissue structure and architecture at the microscopic level.

In particular, the researchers looked at the ability of water to diffuse in different regions of the brain. The greater the diffusion, also known as trace diffusivity, the less white matter structure there is to limit movement, an indication of white matter degeneration.

An area of increased trace diffusivity, relative to the control group, was found in the right posterior cingulate of both the MCI group and the group with lesser cognitive complaints. The participants with MCI also showed increased trace diffusivity in medial temporal regions relative to the control group.

Relative to controls, patients with MCI were more likely to have increased trace diffusivity in the left and right anterior hippocampus and amygdala. Cerebral water diffusion in the group with lesser cognitive complaints was less than in those with MCI and greater than in cognitively normal controls.

Dr. West and his associates also correlated trace diffusivity with cognitive performance. They found that decreasing verbal scores on the California Verbal Learning Test correlated with increasing trace diffusivity in the left and right anterior hippocampus and amygdala.

The findings suggest that DTI could be sensitive to preclinical changes in regions of the brain associated with AD.

AD vs. Lewy Body Dementia on PET

PET imaging shows that patients with dementia with Lewy bodies (DLB) have slightly more β-amyloid in the occipital and sensorimotor cortex than do patients with AD, a finding that may help physicians distinguish the two conditions with similar symptoms, according to a poster that was presented by Victor L. Villemagne, M.D., of Austin Hospital in Melbourne, Australia.

The researchers took advantage of a relatively new PET tracer, Pittsburgh Compound-B (PIB), to image β-amyloid in the brain. PIB is a derivative of thioflavin T that is labeled with radioactive carbon and attaches to β-amyloid deposits in the brain, which then show up on PET imaging.

The researchers imaged eight patients with AD, seven patients with DLB, and seven age-matched healthy controls using PIB PET and FDG PET.

PIB PET images of the patients with AD showed marked binding in the frontal, parietal, and lateral temporal cortices, as well as in the caudate nuclei, suggesting significant β-amyloid deposits there. There was relative sparing of the occipital and sensorimotor cortex and very low uptake in the cerebellar cortex. Patients with DLB appeared similar to those with AD, but slightly higher uptake was noted in the occipital and sensorimotor cortex.

The normal control group showed little or no PIB retention in any cortical or subcortical gray matter areas. Areas of PIB binding were inversely correlated with areas of FDG uptake, a measure of brain activity.

PIB binding patterns can thus be used to distinguish DLB from AD, said Dr. Villemagne, also of the department of pathology at the University of Melbourne, in an interview.

DLB is the second most common cause of dementia after AD, and it is difficult to distinguish the two disorders. Postmortem studies of DLB have shown that the majority of patients have cortical β-amyloid deposits similar to those found in patients with AD.


 

 

DTI relies on the principle that water diffusion is affected by the properties of the medium in which it occurs. Diffusion within biologic tissues reflects tissue structure and architecture at the microscopic level.

In particular, the researchers looked at the ability of water to diffuse in different regions of the brain. The greater the diffusion, also known as trace diffusivity, the less white matter structure there is to limit movement–an indication of white matter degeneration.

An area of increased trace diffusivity–relative to the control group–was found in the right posterior cingulate of both the MCI group and the group with lesser cognitive complaints. The participants with MCI also showed increased trace diffusivity in medial temporal regions relative to the control group.

Relative to controls, patients with MCI were more likely to have increased trace diffusivity in the left and right anterior hippocampus and amygdala. Cerebral water diffusion in the group with lesser cognitive complaints was less than in those with MCI and greater than cognitively normal controls.

Dr. West and his associates correlated trace diffusivity with performance They found that decreasing verbal scores on the California Verbal Learning Test correlated with increasing trace diffusivity in the left and right anterior hippocampus and amygdala.

The findings suggest that DTI could be sensitive to preclinical changes in regions of the brain associated with AD.

AD vs. Lewy Body Dementia on PET

PET imaging shows that patients with dementia with Lewy bodies (DLB) have slightly more β-amyloid in the occipital and sensorimotor cortex than do patients with AD, a finding that may help physicians distinguish the two conditions with similar symptoms, according to a poster that was presented by Victor L. Villemagne, M.D., of Austin Hospital in Melbourne, Australia.

The researchers took advantage of a relatively new PET tracer–the Pittsburgh Compound-B (PIB)–to image β-amyloid in the brain. PIB is a derivative of thiamine that is labeled with radioactive carbon and attaches to β-amyloid deposits in the brain that show up on PET imaging.

The researchers imaged eight patients with AD, seven patients with DLB, and seven age-matched healthy controls using PIB PET and

PIB PET images of the patients with AD showed marked binding in the frontal, parietal, and lateral temporal cortices, as well as the caudate nuclei, suggesting that there were significant β-amyloid deposits there. There was relative sparing of the occipital and sensorimotor cortex and very low uptake in the cerebellar cortex. Patients with DLB appeared similar to those with AD but slightly higher uptake was noted in the occipital and sensorimotor cortex.

The normal control group showed little or no PIB retention in any cortical and subcortical gray matter areas. Areas of PIB binding were inversely correlated with areas of FDG uptake–a measure of brain activity.

The use of PIB binding patterns can distinguish DLB from AD, said Dr. Villemagne, also of the department of pathology at the University of Melbourne, in an interview.

DLB is the second most common cause of dementia after AD, and it is difficult to distinguish the two disorders. Postmortem studies of DLB have shown that the majority of patients have cortical β-amyloid deposits similar to those found in patients with AD.

WASHINGTON – Imaging techniques designed to enable identification of preclinical Alzheimer's disease were showcased in numerous presentations at an international conference sponsored by the Alzheimer's Association.

Preclinical Biochemical Changes

Using magnetic resonance spectroscopy (MRS), researchers in the United Kingdom identified biochemical changes in the posterior cingulate in a group of symptom-free subjects genetically destined to develop Alzheimer's disease (AD).

The researchers imaged seven volunteers with familial AD, who carry presenilin-1 or amyloid precursor protein gene mutations and have an almost 100% chance of developing AD, according to Alison Godbolt, M.B., of the Dementia Research Centre at University College London.

At the time of the study, these individuals had normal memory. Six healthy volunteers without familial AD were also recruited to serve as controls.

MRS provides information about select chemicals in a specific area of the brain that are involved in metabolism. MRS is performed using the same scanners as magnetic resonance imaging (MRI) and takes about 45 minutes.

The researchers looked at a single voxel along the midline of the posterior cingulate, a region that is known to be involved in AD. They measured the ratio of N-acetylaspartate to creatine and the ratio of myo-inositol to creatine.

Subjects with the genetic mutation had N-acetylaspartate/creatine ratios that were 10% lower and myo-inositol/creatine ratios that were 20% greater than those of the control group. The difference in the myo-inositol/creatine ratios between the two groups did not reach statistical significance. “Interestingly, other researchers have found the same changes in people who already have the disease,” Dr. Godbolt said.

Reduced levels of N-acetylaspartate are thought to be due to nerve cell dysfunction and loss; increased myo-inositol levels are thought to be due to increased inflammation. In addition, the volunteers with the gene mutation who were closest to their predicted age of onset had the most abnormal levels of these two chemicals, “suggesting a gradual buildup of changes over the several years before symptoms begin,” she said.

Screening via Hippocampal Size

Reduced hippocampal volume on MRI, combined with the results of the Mini Mental State Examination (MMSE), appears to do a better job of identifying patients with mild cognitive impairment (MCI) and AD than MMSE alone, according to a poster presented by Claire K. Sandstrom, who is a medical student at Duke University in Durham, N.C., and her colleagues.

Several recent studies have shown that individuals with MCI have smaller hippocampal volumes on MRI, compared with healthy controls. This is especially true for the subgroup of individuals with MCI who later convert to AD.

The researchers evaluated 18 volunteers (11 men) with MCI and 17 volunteers (8 men) with normal cognition with the MMSE and MRI. Those with MCI were age 74 years on average and had a mean MMSE score of 27, while the control group was age 70 years on average and had a mean score of 28, Ms. Sandstrom told this newspaper.

Hippocampal atrophy was greater in the volunteers with MCI than in the controls. Left hippocampal volume was significantly smaller than right hippocampal volume only in the people with MCI.

The researchers developed receiver operating characteristic curves to evaluate how well left hippocampal volume, right hippocampal volume, MMSE score, and the combination of left hippocampal volume and MMSE score identified patients with AD and MCI. After analyzing these curves, the researchers concluded that the combination of left hippocampal volume and MMSE score was superior to the MMSE alone in identifying patients with AD and MCI. “Left hippocampal volume added significantly to the discriminatory capacity of the MMSE scores for differentiating between cognitively normal individuals and those with MCI,” the authors wrote.

DTI Reveals Brain Changes in MCI

Using diffusion-tensor imaging (DTI), researchers have identified changes in the left and right anterior hippocampus and amygdala in patients with MCI and in those with mild cognitive complaints, but not in cognitively normal subjects.

The researchers imaged 27 individuals with MCI (mean age 74 years), 25 individuals with cognitive complaints (mean age 73 years), and 33 healthy controls (mean age 72 years), according to a poster presented by John D. West of the Brain Imaging Lab at Dartmouth College in Hanover, N.H., and his colleagues.

The participants were drawn from the ongoing Dartmouth Memory and Aging Study. The groups were balanced for age, education, and sex. They also were assessed using the California Verbal Learning Test.

DTI reveals disruptions of the white matter tracts that are not visible on MRI. Within white matter, water moves parallel to tracts. Conventional MRI can distinguish white from gray matter but can provide very little detail about the white matter; MRI cannot observe or quantify specific fiber tract directions.

DTI relies on the principle that water diffusion is affected by the properties of the medium in which it occurs. Diffusion within biologic tissues reflects tissue structure and architecture at the microscopic level.

In particular, the researchers looked at the ability of water to diffuse in different regions of the brain. The greater the diffusion, also known as trace diffusivity, the less white matter structure there is to limit movement, an indication of white matter degeneration.

An area of increased trace diffusivity, relative to the control group, was found in the right posterior cingulate of both the MCI group and the group with lesser cognitive complaints. The participants with MCI also showed increased trace diffusivity in medial temporal regions relative to the control group.

Relative to controls, patients with MCI were more likely to have increased trace diffusivity in the left and right anterior hippocampus and amygdala. Cerebral water diffusion in the group with lesser cognitive complaints was less than in those with MCI and greater than in cognitively normal controls.

Dr. West and his associates correlated trace diffusivity with performance. They found that lower scores on the California Verbal Learning Test correlated with greater trace diffusivity in the left and right anterior hippocampus and amygdala.

The findings suggest that DTI could be sensitive to preclinical changes in regions of the brain associated with AD.

AD vs. Lewy Body Dementia on PET

PET imaging shows that patients with dementia with Lewy bodies (DLB) have slightly more β-amyloid in the occipital and sensorimotor cortex than do patients with AD, a finding that may help physicians distinguish between these two conditions, which have similar symptoms, according to a poster presented by Victor L. Villemagne, M.D., of Austin Hospital in Melbourne, Australia.

The researchers took advantage of a relatively new PET tracer, Pittsburgh Compound B (PIB), to image β-amyloid in the brain. PIB is a derivative of thioflavin T that is labeled with radioactive carbon and binds to β-amyloid deposits in the brain, making them visible on PET imaging.

The researchers imaged eight patients with AD, seven patients with DLB, and seven age-matched healthy controls using PIB PET and FDG PET.

PIB PET images of the patients with AD showed marked binding in the frontal, parietal, and lateral temporal cortices, as well as in the caudate nuclei, suggesting significant β-amyloid deposits in those regions. There was relative sparing of the occipital and sensorimotor cortex and very low uptake in the cerebellar cortex. Patients with DLB appeared similar to those with AD, but slightly higher uptake was noted in the occipital and sensorimotor cortex.

The normal control group showed little or no PIB retention in any cortical and subcortical gray matter areas. Areas of PIB binding were inversely correlated with areas of FDG uptake, a measure of brain activity.

The use of PIB binding patterns can distinguish DLB from AD, said Dr. Villemagne, also of the department of pathology at the University of Melbourne, in an interview.

DLB is the second most common cause of dementia after AD, and it is difficult to distinguish the two disorders. Postmortem studies of DLB have shown that the majority of patients have cortical β-amyloid deposits similar to those found in patients with AD.

Display Headline
Imaging Breakthroughs Reveal Early AD Changes

FDA: Vision Loss Reported With ED Drug Use

Article Type
Changed
Display Headline
FDA: Vision Loss Reported With ED Drug Use

The Food and Drug Administration has approved revised labeling for Cialis (tadalafil), Levitra (vardenafil), and Viagra (sildenafil) to reflect a small number of postmarketing reports of sudden vision loss.

The sudden loss of eyesight is attributed to nonarteritic ischemic optic neuropathy, in which blood flow to the optic nerve is blocked. It is not known whether these erectile dysfunction medications cause the condition. People with a higher chance of developing the condition include those who:

▸ Have heart disease.

▸ Are older than 50 years.

▸ Have diabetes.

▸ Have high blood pressure.

▸ Have high cholesterol.

▸ Smoke.

▸ Have certain eye problems.

Patients taking these medications are advised to discontinue use and contact their physician immediately if they experience sudden vision loss or decreased vision in one or both eyes. In addition, patients taking or considering these products should inform their physician if they have ever had severe loss of vision, which could reflect a prior episode of nonarteritic ischemic optic neuropathy. Such patients have an increased risk of developing the condition again.

For additional information about labeling changes to Cialis, visit www.fda.gov/cder/drug/infopage/cialis/default.htm

For additional information about labeling changes to Levitra, visit www.fda.gov/cder/drug/infopage/vardenafil/default.htm

For additional information about labeling changes to Viagra, visit www.fda.gov/cder/consumerinfo/Viagra/Viagra.htm


Many Children Manage Their Own Asthma Meds

Article Type
Changed
Display Headline
Many Children Manage Their Own Asthma Meds

WASHINGTON — Half of children aged 7–9 years are primarily responsible for taking their asthma medication, according to the results of one study presented at the annual meeting of the Pediatric Academic Societies.

More than two-thirds of the surveyed children and parents reported that the child takes asthma medications on his or her own all or most of the time—68% of children and 66% of parents, said Lynn Olson, Ph.D., codirector of practice and research for the American Academy of Pediatrics in Elk Grove Village, Ill.

The data come from the Child Health Information Reporting Project. For the project, children ranging in age from 7 to 16 years were recruited in office and community settings in Chicago, its suburbs, and Cincinnati. A total of 414 parent-child pairs were included; parents and children were interviewed separately.

“We found that the agreement between parent and child was really pretty good … 80% agreeing within 1 percentage point,” including 40% agreeing exactly, Dr. Olson said.

African American children accounted for 46% of the population, with 40% white, 11% Hispanic, and the rest “other.” Forty-two percent of parents reported household incomes of less than $30,000 per year.

Fifty-three percent of parents reported that their child had moderate to severe asthma, with a mean of 3.2 symptom days reported in the previous 2 weeks.

There was no relationship between the child taking responsibility for asthma medication and socioeconomic factors, such as income or the mother's level of education. Whether or not the parent had asthma was not associated with the child's responsibility for taking his or her asthma medication.

Children were more likely to be involved in managing their own asthma medication with increasing age. Among children aged 7–9 years, 56% of children and 50% of parents reported the child had a major role in managing his or her asthma, compared with 73% of children and 74% of parents among children aged 10–13 years, and 86% of children and 78% of parents among those aged 14–16 years.

The researchers also asked parents how often their child took asthma medication as directed. More than a quarter of parents (28%) reported that their child did so only some or none of the time. Among these parents, 56% reported that the child was primarily responsible for taking his or her medication. Among parents who reported good medication adherence for their child, 71% said that the child was primarily responsible for taking the medication.

“The relationship between responsibility and adherence is complex,” said Dr. Olson, speaking at the meeting sponsored by the American Pediatric Society, the Society for Pediatric Research, the Ambulatory Pediatric Association, and the American Academy of Pediatrics.

“Relatively little attention has been given to the child's role in medication management. Our observation is that most interventions and education are directed toward the parent,” Dr. Olson said. These findings suggest that asthma management education should be targeted at children as well as parents.


High Parity Poses Greatest SIDS Risk in Offspring

Article Type
Changed
Display Headline
High Parity Poses Greatest SIDS Risk in Offspring

WASHINGTON — High parity has replaced preterm delivery as the greatest risk factor for SIDS, according to a study of national data, presented at the annual meeting of the Pediatric Academic Societies.

“Our highest single risk factor was high parity,” said Donna R. Halloran, M.D., of the University of Alabama, Birmingham. Mothers with a parity of five or greater were 3.6 times more likely to have an infant die of SIDS.

This change follows an epidemiologic shift in SIDS deaths that occurred in the mid-1990s. In 1991, 1.2 cases of SIDS occurred for every 1,000 live births in the United States. By 1996, that number had dropped dramatically, to 0.7 cases for every 1,000 live births. In 2002, there were 0.6 cases for every 1,000 live births.

The decrease in SIDS deaths has been attributed to the National Institute of Child Health and Human Development's “Back to Sleep” educational campaign initiated in 1994. The number of parents putting their infants in a prone sleep position dropped dramatically. In 1992, 70% of infants were sleeping in a prone position, compared with 18% in 1996.

The study population included all singleton live births in the United States from 1996 to 1998. These data came from the National Center for Health Statistics birth cohort (with linked birth and death files). Infants were excluded if their gestation was less than 22 weeks or greater than 44 weeks. Multiple gestations also were excluded, as were infants of nonresident mothers.

The multivariate analysis model included maternal variables: race/ethnicity, education, age, marital status, smoking, alcohol use, diabetes, hypertension, and parity. The model also included infant gender, region of birth, fetal growth, and gestation. Fetal growth was defined as birth weight for length of gestation: small (below the 10th percentile), appropriate, and large (above the 90th percentile).

A total of 8,199 deaths due to SIDS were identified, for a rate of 0.72 deaths per 1,000 live births. High parity may have replaced preterm delivery as the greatest risk factor, but preterm birth is still a strong predictor of SIDS risk. Infants born at less than 32 weeks' gestational age were three times more likely to die of SIDS than those born at 40–41 weeks. The odds ratio for SIDS death increased as gestational age decreased.

This may be especially important because the preterm delivery rate has risen in the last 15 years. In 1990, 10.6% of infants born in the United States were preterm. In 2002, 12.1% of infants were born preterm—almost a half million infants per year.

“We found that preterm birth and fetal growth are actually independent risk factors for SIDS. This is a new finding in the United States,” said Dr. Halloran. Infants small for their gestational age were 1.7 times more likely to die of SIDS. Large size seemed to have a protective effect, with infants large for their gestational age 30% less likely to die of SIDS.

Hispanic infants were 50% less likely to die of SIDS, compared with non-Hispanic white infants. Non-Hispanic black and American Indian infants had a greater risk (OR 1.3 and 1.4, respectively). “Native Americans and non-Hispanic blacks actually have increasing odds ratios,” she said. It appears this may be due to the rate of SIDS deaths having dropped among non-Hispanic whites, resulting from the success of the “Back to Sleep” educational campaign.

Other traditional risk factors for SIDS are unchanged following this epidemiologic transition. Male infants were 1.5 times more likely to die of SIDS than females. Infants born to mothers with low education were 1.3 times more likely to die of SIDS; those born to mothers with higher education levels were 20% less likely to die of SIDS.

Infants born to mothers younger than 20 years of age were 1.7 times more likely to die of SIDS than those born to mothers in their 20s, and infants born to mothers older than 30 and older than 35 were both 50% less likely to die of SIDS. Infants born to unmarried mothers were 1.6 times more likely to die of SIDS than those born to married mothers. Infants born to women who smoked were 2.4 times more likely to die of SIDS than those born to nonsmokers.

The meeting also was sponsored by the American Pediatric Society, the Society for Pediatric Research, the Ambulatory Pediatric Association, and the American Academy of Pediatrics.

Article PDF
Author and Disclosure Information

Publications
Topics
Author and Disclosure Information

Author and Disclosure Information

Article PDF
Article PDF

WASHINGTON — High parity has replaced preterm delivery as the greatest risk factor for SIDS, according to a study of national data, presented at the annual meeting of the Pediatric Academic Societies.

“Our highest single risk factor was high parity,” said Donna R. Halloran, M.D., of the University of Alabama, Birmingham. Mothers with a parity of five or greater were 3.6 times more likely to have an infant die of SIDS.

This shift follows an epidemiologic transition in SIDS deaths that occurred in the mid-1990s. In 1991, 1.2 cases of SIDS occurred for every 1,000 live births in the United States. By 1996, the number had dropped dramatically, to 0.7 cases for every 1,000 live births. In 2002, there were 0.6 cases for every 1,000 live births.

The decrease in SIDS deaths has been attributed to the National Institute of Child Health and Human Development's “Back to Sleep” educational campaign initiated in 1994. The number of parents putting their infants in a prone sleep position dropped dramatically. In 1992, 70% of infants were sleeping in a prone position, compared with 18% in 1996.

The study population included all singleton live births in the United States from 1996 to 1998. These data came from the National Center for Health Statistics birth cohort (with linked birth and death files). Infants were excluded if their gestation was less than 22 weeks or greater than 44 weeks. Multiple gestations also were excluded, as were infants of nonresident mothers.

The multivariate analysis model included maternal variables—race/ethnicity, education, age, marital status, smoking, alcohol use, diabetes, hypertension, and parity. The model also included infant gender, region of birth, fetal growth, and gestation. Fetal growth was defined as birth weight given the length of gestation: small (lower-10th percentile), appropriate, and large (upper-10th percentile).
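The fetal-growth categories described above amount to a simple percentile cutoff rule. As a hedged sketch (the function name and percentile inputs are illustrative; the study derived weight-for-gestation percentiles from national data, not shown here):

```python
# Illustrative sketch of the fetal-growth classification: birth weight
# percentile for a given gestational age, cut at the 10th and 90th
# percentiles. Names and inputs are hypothetical, not from the study.
def classify_fetal_growth(weight_percentile: float) -> str:
    if weight_percentile < 10:
        return "small"        # lower-10th percentile
    if weight_percentile > 90:
        return "large"        # upper-10th percentile
    return "appropriate"

print(classify_fetal_growth(5))    # small
print(classify_fetal_growth(50))   # appropriate
print(classify_fetal_growth(95))   # large
```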

A total of 8,199 SIDS deaths were identified, for a rate of 0.72 deaths per 1,000 live births. High parity may have replaced preterm delivery as the greatest risk factor, but preterm birth still is a strong predictor of SIDS risk. Infants born at less than 32 weeks' gestation were three times more likely to die of SIDS than those born at 40–41 weeks. The odds ratio for SIDS death increased as gestational age decreased.
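The article reports the death count and rate but not the cohort size; as a back-of-envelope check, the two figures together imply a cohort of roughly 11.4 million singleton live births:

```python
# Implied size of the 1996-1998 singleton birth cohort, derived from
# the reported 8,199 SIDS deaths at 0.72 per 1,000 live births.
sids_deaths = 8_199
rate_per_1000 = 0.72

implied_births = sids_deaths / (rate_per_1000 / 1000)
print(f"{implied_births / 1e6:.1f} million")  # about 11.4 million
```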

This may be especially important because the preterm delivery rate has risen in the last 15 years. In 1990, 10.6% of infants born in the United States were preterm. In 2002, 12.1% of infants were born preterm—almost a half million infants per year.
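The "almost a half million" figure is consistent with the stated preterm rate if one assumes roughly 4 million U.S. births per year (a round figure assumed here for illustration, not stated in the article):

```python
# Consistency check: 12.1% preterm applied to an assumed ~4 million
# annual U.S. births. The annual-births total is an assumption.
preterm_fraction = 0.121
annual_births = 4_000_000

print(round(preterm_fraction * annual_births))  # 484000
```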

“We found that preterm birth and fetal growth are actually independent risk factors for SIDS. This is a new finding in the United States,” said Dr. Halloran. Infants small for their gestational age were 1.7 times more likely to die of SIDS. Large size seemed to have a protective effect, with infants large for their gestational age 30% less likely to die of SIDS.

Hispanic infants were 50% less likely to die of SIDS, compared with non-Hispanic white infants. Non-Hispanic black and American Indian infants had a greater risk (OR 1.3 and 1.4, respectively). “Native Americans and non-Hispanic blacks actually have increasing odds ratios,” she said. It appears this may be due to the rate of SIDS deaths having dropped among non-Hispanic whites, resulting from the success of the “Back to Sleep” educational campaign.

Other traditional risk factors for SIDS are unchanged following this epidemiologic transition. Male infants were 1.5 times more likely to die of SIDS than females. Infants born to mothers with low education were 1.3 times more likely to die of SIDS; those born to mothers with higher education levels were 20% less likely to die of SIDS.

Infants born to mothers younger than 20 years of age were 1.7 times more likely to die of SIDS than those born to mothers in their 20s, and infants born to mothers older than 30 and to mothers older than 35 were each 50% less likely to die of SIDS. Infants born to unmarried mothers were 1.6 times more likely to die of SIDS than those born to married mothers. Infants born to women who smoked were 2.4 times more likely to die of SIDS than those born to nonsmokers.

The meeting also was sponsored by the American Pediatric Society, the Society for Pediatric Research, the Ambulatory Pediatric Association, and the American Academy of Pediatrics.

Display Headline
High Parity Poses Greatest SIDS Risk in Offspring

Physician Adherence to Guidelines for ADHD Varies Widely

Article Type
Changed
Display Headline
Physician Adherence to Guidelines for ADHD Varies Widely

WASHINGTON — It appears that there is a wide range of adherence to the American Academy of Pediatrics guidelines on attention-deficit hyperactivity disorder, Wendy Davis, M.D., said at the annual meeting of the Pediatric Academic Societies.

“While [physicians] show a high level of confidence in prescribing and monitoring stimulant medications … few [physicians] in our study practiced in a manner that reflected understanding and documented use of targeted outcomes. Furthermore, our physicians expressed a lack of confidence in their ability to diagnose and treat attention-deficit hyperactivity coexisting conditions,” said Dr. Davis, a professor of pediatrics at the University of Vermont in Burlington.

In 2000, the AAP released clinical practice guidelines for the diagnosis and evaluation of the child with attention-deficit hyperactivity disorder (ADHD). Dr. Davis and her colleagues evaluated a group of pediatricians in Vermont for their adherence to the following selected recommendations from the guidelines:

▸ The clinician should recommend stimulant medication or behavior therapy as appropriate to improve targeted outcomes.

▸ The physician, parents, child, and school personnel should collaborate to identify targeted outcomes to guide management.

▸ The physician should periodically provide a systematic follow-up for the child with ADHD, and monitoring should be directed to targeted outcomes and adverse effects by obtaining specific information from parents, teachers, and the child.

▸ Evaluation of the child with ADHD should include assessment of possible coexisting conditions.

A total of 22 doctors in five pediatric practices—20% of practicing pediatricians in Vermont—participated. A self-administered pediatrician confidence survey served as a baseline measure. In this survey, pediatricians were asked to rate their confidence with various aspects of the diagnosis and treatment of ADHD.

In addition, a preintervention chart audit was conducted to assess adherence to the AAP guidelines on several measures. The initial chart review included charts for all 5- to 15-year-old patients with a diagnosis of ADHD after 2001—a total of 225 charts (75% of the patients were male).

In the survey, 89% of pediatricians responded that they were mostly or highly confident in starting patients on stimulant medication for the treatment of ADHD.

Based on the preintervention chart audit, 92% of charts indicated stimulants had been prescribed for the treatment of ADHD.

A total of 79% of pediatricians responded that they were mostly or highly confident in adjusting stimulant medication, and 72% of charts had evidence of dosage changes after the initial prescription.

Setting targeted outcomes proved to be more of a challenge: 58% of pediatricians responded that they were mostly or highly confident in setting them, but only 38% of charts had evidence of documented targeted outcomes.

Only 37% of pediatricians were mostly or highly confident in arranging for and coordinating nonpharmacologic treatment of ADHD.

However, 68% indicated that they communicated with school personnel most or almost all of the time.

According to the chart audit, parents were involved in treatment planning and monitoring 85% of the time. In addition, 77% of the charts had evidence of consultation with school personnel.

Based on the chart audit, adverse effects were evaluated 86% of the time, though only 71% of charts had notations of the duration of effectiveness, and only 39% indicated assessment of the adequacy of medication effectiveness.

Based on the survey, only a third (32%) of pediatricians were mostly or highly confident in identifying coexisting psychiatric conditions. And only 21% were mostly or highly confident in treating ADHD coexisting conditions.

Only 32% of charts had notations of coexisting conditions, Dr. Davis said at the meeting, which also was sponsored by the American Pediatric Society, the Society for Pediatric Research, the Ambulatory Pediatric Association, and the American Academy of Pediatrics.
