Timing of irradiation affects RCCs
ANAHEIM, CA—The timing of gamma irradiation influences in vitro characteristics of red cell concentrates (RCCs), according to a new study.
The research showed that RCCs sustain more damage the longer they are stored prior to gamma irradiation and the longer they are stored after irradiation.
However, RCCs from female donors appeared to be less susceptible to irradiation injury, and the additive solution used seemed to affect the level of injury as well.
Dirk de Korte, PhD, of Sanquin Blood Bank in Amsterdam, the Netherlands, presented these results at the 2015 AABB Annual Meeting (abstract S72-040A).
The study included 7 centers, each of which used its standard RCCs. Five centers used SAGM as additive solution, 1 used AS-3, and 1 used PAGGSM. Two centers used whole blood filtration to prepare leukoreduced RCCs, and 5 centers used buffy coat removal and RCC filtration.
Each center produced 4 pools, each consisting of 7 RCCs: 2 pools from male donors and 2 from female donors. The units were stored for 43 days, and 1 pool was gamma-irradiated each week.
The researchers performed weekly sampling to assess in vitro quality parameters, with extra samples taken at 24 and 72 hours after irradiation.
The team found that the age of RCCs at the time of irradiation influenced the rate of increase of hemolysis and the absolute level of hemolysis (P<0.0001).
Hemolysis was higher in units irradiated early and then stored. And the rate of change of hemolysis increased if RCCs were stored for longer before irradiation.
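Hemolysis in stored RCCs is conventionally expressed as the percentage of total hemoglobin that is free in the supernatant, corrected for hematocrit. The abstract does not spell out the calculation used, so the sketch below shows the conventional formula as an assumption, with hypothetical variable names and illustrative numbers.

```python
def percent_hemolysis(total_hb_g_dl, supernatant_hb_g_dl, hematocrit_pct):
    """Conventional hemolysis calculation for stored red cell concentrates.

    Hemolysis (%) = (100 - Hct) x supernatant Hb / total Hb
    Both Hb values must be in the same units (e.g., g/dL); hematocrit in percent.
    This is the standard formula, assumed here; the study abstract does not
    specify the exact method used.
    """
    return (100.0 - hematocrit_pct) * supernatant_hb_g_dl / total_hb_g_dl

# Illustrative (hypothetical) numbers: a unit with total Hb 20 g/dL,
# supernatant Hb 0.1 g/dL, and hematocrit 60% has 0.2% hemolysis,
# below the 0.8% end-of-storage limit commonly applied in Europe.
print(round(percent_hemolysis(20.0, 0.1, 60.0), 2))  # 0.2
```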
The researchers also found that the age of RCCs at the time of irradiation influenced the rate of increase of potassium and the absolute level of potassium (P<0.0001).
The rate of change of potassium decreased if RCCs were stored longer before irradiation, as potassium had already been partly released during the longer storage. Within 7 days of irradiation, potassium levels exceeded those observed in control cells stored for 43 days.
Hemolysis and potassium levels also appeared to be affected by donor sex and the additive solution used.
Hemolysis was lower in RCCs from female donors (P=0.045) and tended to be lower in cells stored in AS-3 or PAGGSM rather than SAGM (P=0.0597).
Potassium release was lower in cells from female donors (P=0.0032) and in cells stored in AS-3 rather than PAGGSM or SAGM (P=0.0391).
“This study shows or confirms interesting differences between red cells from males and females, and, of course, we are interested in the underlying mechanism,” Dr de Korte said.
He also said the results of this study will be used to formulate guidance on the maximal pre- and post-irradiation storage time for RCCs with respect to either acceptable hemolysis or potassium release.
Dr de Korte said that, if hemolysis is used as guidance, irradiation should be performed within the first 28 days of storage, and the cells should be used within these 28 days.
If potassium is used as guidance, cells should be used within 7 days of irradiation if the irradiation occurs during the first 10 to 14 days of storage, or the cells should be used immediately after irradiation if irradiation takes place later during storage.
Newer apheresis system appears superior to standard
ANAHEIM, CA—A newer, more streamlined apheresis system yields more CD34+ cells from stem cell transplant donors than a previous system, according to a new study.
Researchers used both tools—the COBE Spectra Apheresis System and the Spectra Optia Apheresis System—to collect mononuclear cells (MNCs) from healthy donors and found the collection efficiency and yield was superior with the Spectra Optia.
There were no unanticipated or serious adverse events with either system, and the frequency of treatment-emergent adverse events did not differ according to the system used.
Jose A. Cancelas, MD, PhD, of Hoxworth Blood Center in Cincinnati, Ohio, presented the results of this research at the 2015 AABB Annual Meeting (abstract S21-020A). The study was supported by Terumo BCT, the company that makes both systems.
The COBE Spectra Apheresis System collects MNCs via single-step processing and separation. It has been the gold standard for hematopoietic stem and progenitor cell collection since 1987, Dr Cancelas noted.
The newer Spectra Optia Apheresis System uses optical sensors for tracking the separation process and real-time electronic adjustment of plasma pump velocity (automatic interface management). A single-step, continuous MNC collection (CMNC) protocol, which was recently approved for use with this system in the US, is intended to increase automation and MNC collection reproducibility.
To compare the 2 systems, Dr Cancelas and his colleagues conducted a prospective, randomized, crossover study of 22 healthy donors. They had a mean age of 35 and a mean body mass index of 34.2 kg/m².
The donors underwent 2 MNC collections, first with one apheresis system and then the other. Both times, the donors underwent apheresis on Days 5 and 6 after standard MNC mobilization with granulocyte colony-stimulating factor (G-CSF at 10 μg/kg/day) through Day 5. After the first collection, there was a 2-week washout period.
The study’s primary endpoint was CD34+ cell collection efficiency, which was the percentage of cells collected using the averaged pre/post-collection cell counts as the denominator (CE1). The secondary endpoint was also CD34+ cell collection efficiency, but this was the percentage of cells collected using only the pre-collection cell count as a denominator (CE2).
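A minimal sketch of the two efficiency definitions follows, assuming the usual formulation in which the number of CD34+ cells collected is divided by the number of CD34+ cells that passed through the device (peripheral count × blood volume processed). The function name, variable names, and example numbers are illustrative, not from the study.

```python
def collection_efficiency(cd34_collected, blood_volume_processed_ml,
                          pre_count_per_ul, post_count_per_ul=None):
    """CD34+ collection efficiency (%).

    CE2 uses only the pre-collection peripheral count as the denominator;
    CE1 uses the average of the pre- and post-collection counts.
    Counts are in cells/uL, so multiply by 1,000 to get cells/mL.
    This mirrors the definitions described in the study, but the exact
    formulas are an assumption based on common practice.
    """
    if post_count_per_ul is None:          # CE2
        denominator_count = pre_count_per_ul
    else:                                  # CE1
        denominator_count = (pre_count_per_ul + post_count_per_ul) / 2.0
    cells_processed = denominator_count * 1_000 * blood_volume_processed_ml
    return 100.0 * cd34_collected / cells_processed

# Hypothetical example: 3.0e8 CD34+ cells collected from 7,500 mL processed,
# pre-collection count 60/uL, post-collection count 35/uL.
ce1 = collection_efficiency(3.0e8, 7_500, 60, 35)   # ~84%
ce2 = collection_efficiency(3.0e8, 7_500, 60)       # ~67%
print(round(ce1, 1), round(ce2, 1))
```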
The researchers also assessed the CD34+ cell yield (CD34+ cells/kg), MNC product contamination/purity, procedure time, product volume, the need for operator involvement, and safety.
Results
All collections processed 1.5 times the total blood volume, and the procedures took nearly 2.5 hours, with no meaningful difference in procedure time between the 2 systems.
The average flow rates were 66 mL/minute with the Spectra Optia and 68 mL/minute with COBE Spectra. Product volumes were 143 mL and 139 mL, respectively.
The Optia proved significantly superior to the COBE system with regard to CE1, CE2, and the CD34+ yield.
The mean CD34+ CE1 was 85% with Optia and 66.2% with COBE (P<0.001). The mean CD34+ CE2 was 62% and 48.4%, respectively (P<0.001). And the mean CD34+ yield (cells/kg) was 4.5 and 3.58, respectively (P=0.001).
In addition, granulocyte contamination was lower with the Optia system than the COBE system. The mean granulocyte yield was 7.7 × 10⁹ and 10.6 × 10⁹ granulocytes per unit, respectively (P=0.022).
However, red blood cell and platelet contamination levels were similar between the systems. The mean red blood cell volume was 7.4 mL with Optia and 7.0 mL with COBE (P=0.660). And the mean platelet yield was 4.3 × 10¹¹ and 4.6 × 10¹¹, respectively (P=0.081).
Overall, there was no significant difference between the Optia and COBE systems in the need for operator adjustments, although there was a trend toward fewer adjustments with the Optia system. The Optia required a median of 5.5 adjustments (range, 0-12), and the COBE system required a median of 6.5 (range, 1-14).
Dr Cancelas said the frequency of treatment-emergent adverse events did not differ according to the system used. And there were no unanticipated or serious adverse events.
The most frequently reported pre-collection treatment-emergent adverse events were back pain (n=10, 44%), bone pain (n=9, 39%), and fatigue (n=5, 22%).
“These results demonstrate that the Optia CMNC procedure is a safe and efficient means of collecting CD34+ cells in G-CSF mobilized donors,” Dr Cancelas said.
“The Optia collection efficiencies for CD34+ cells were significantly superior to the COBE . . . . And the Optia with automatic interface management system represents a technological advance in our ability to collect CD34+ cells.”
Survey: 3 in 10 MSMs don’t comply with UK blood donor policy
ANAHEIM, CA—A survey of UK blood donors suggests that as many as 30% of donors who are men who have sex with men (MSM) may not be compliant with the MSM blood donor policy.
The UK’s policy requires that MSMs do not donate blood if they have engaged in sexual activity with another male in the last 12 months.
But the survey indicates that as many as 3 in 10 MSMs are disregarding this policy.
The research also suggests that MSMs who do not comply with the policy engage in riskier sexual behavior than non-MSM male blood donors.
However, the researchers found no increase in the number of sexually transmitted infections present in the blood supply since the donation policy for MSMs changed from a lifetime ban to a 12-month deferral period.
The infections evaluated include human immunodeficiency virus (HIV), hepatitis C virus (HCV), hepatitis B virus (HBV), and syphilis.
The researchers also emphasized that the prevalence of HIV-positive blood donations in the UK remains low.
Katie Davidson, of Public Health England in London, presented these findings at the 2015 AABB Annual Meeting (abstract S36-030E*).
She noted that, in 2011, the blood services of England, Wales, and Scotland changed the blood donor policy for MSMs from a lifetime ban to a 12-month deferral since last male-to-male sex.
Before this policy change took effect, the blood services estimated that the change would mean 2679 MSMs would be newly eligible to donate blood (0.7% of male donors), and 8 of these donors would have HIV. So there would be a 0.5% increase in HIV risk.
“But what was clear was that these predictions in terms of HIV risk would be very dependent upon compliance,” Davidson said. “And what we mean by compliance is that a donor understands the rule, applies it correctly, and discloses any relevant information when they’re asked.”
To investigate donor behavior and compliance, Davidson and her colleagues conducted a large-scale, anonymous, web-based survey of blood donors.
Each month for 1 year (2013), all eligible new blood donors and at least an equal number of repeat blood donors in the UK were invited, via email, to complete an online questionnaire asking about their sexual history and compliance with the 12-month deferral policy for MSM (if applicable).
The researchers also looked at UK surveillance data on infections (HIV, HBV, HCV, and syphilis) in new and repeat blood donors over 6 years, comparing the incidence of infections before and after the policy change took effect (3 years pre- and post-change).
Donation and compliance
Among 65,439 survey respondents, 22,776 (35%) were male, and 242 (1%) were MSMs. Among the MSMs, 73 reported male-to-male sex within the last 12 months (non-compliance), and 181 said it had been more than 12 months since their last male-to-male sexual encounter.
The researchers adjusted these proportions for differences among the respondents and the donor population and extrapolated the data to the whole UK donor population.
The team estimated that, among 488,523 UK donors, there would be 5471 MSMs. Of the MSM donors, 3713 would be eligible under the new policy, and 1759 would be non-compliant.
So compliance with the 12-month deferral policy would be 99.7% among all male donors but 70.4% among MSM donors.
“So 3 in every 10 MSMs donating blood in the UK shouldn’t be, [according to the estimates],” Davidson said.
The survey asked non-compliant MSM donors to provide their reasons for non-compliance, and many gave more than 1 reason.
“The reasons seemed to be associated, mostly, with self-assessment of their own risk [of transmitting infection] to be low,” Davidson said. “So that was based on the fact that they were in a monogamous relationship, they used condoms, practiced safe sex, or had regular [sexual health] screenings.”
However, there were some donors who regarded the policy as unimportant or said they didn’t agree with it. And there were some donors who didn’t declare their sexual behavior because they knew they wouldn’t be allowed to donate if they did.
Sexual behavior
Among all male respondents who reported having sex within the last 12 months, MSMs were more likely than men who had only female sexual partners to report having sex with more than 1 partner. Fifty percent of MSMs had more than 1 sexual partner in the last 12 months, as did 9.1% of male donors with only female sexual partners.
Ten percent of MSMs reported paying for sex, as did 0.3% of non-MSMs. None of the MSMs reported having a partner who was HIV-positive, and less than 0.1% of non-MSMs said they had an HIV-positive partner.
Eleven percent of MSMs said they had a history of sexually transmitted infection, as did less than 0.1% of non-MSMs.
“So among the responders, there was very low numbers who reported a high-risk partner in the last 12 months,” Davidson noted. “But there was some suggestion, among these low numbers, that this was more common in the MSMs than the non-MSMs.”
She also acknowledged that some donors were unsure about whether they had a high-risk partner in the last 12 months.
Infections
The UK surveillance data on infections encompassed HIV, HBV, HCV, and syphilis.
In all, 3,667,408 blood donations from males were tested for infection in the 3 years prior to the MSM donor policy change, and 3,066,076 donations were tested in the 3 years after the change was implemented.
There were 428 donors who reported having an infection risk before the change and 268 who did so after. There were 577 donors who actually had an infection before the change and 434 who did after. And there were 32 infected MSM donors before the change and 34 after.
“So the number of male donors fell post-change by approximately 20%, [and] the total number of infected donors . . . fell by almost 30%,” Davidson noted.
“However, the number of MSM infected donors marginally increased, [and] the proportion of male infected donors who were MSMs, among all of those who reported a risk, increased from 7% [32/428] to 13% [34/268]. So there seems to be some impact [on infection] from MSMs, but the numbers are very small, and these differences are not significant.”
Predictions and HIV infection
Finally, the researchers compared their predictions from before the MSM blood donor policy change to the actual data after the change. This comparison assumed that the absolute number of compliant MSMs did not change after the policy changed.
In 2007, the group predicted there would be about 2 million blood donations, including 2679 from MSMs. In reality, in 2014, there were 1.9 million blood donations, including 3126 from MSMs.
The researchers predicted the number of HIV-positive donations would be 30, including 8 from MSMs. In reality, in 2014, there were 13 HIV-positive donations, including 1 from an MSM.
So the predicted HIV prevalence per 100,000 donations was 1.4, and the actual HIV prevalence was 0.7. The predicted HIV incidence per 100,000 person-years was 0.9, and the actual HIV incidence was 0.7.
The predicted HIV risk was 0.022 per 100,000, and the actual HIV risk was 0.016 per 100,000.
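As a rough check on how the per-100,000 figures are derived, the sketch below uses the rounded donation totals reported above as the denominators; the exact (unrounded) totals used by the researchers are not given, so these are assumptions.

```python
def per_100_000(events, denominator):
    """Rate per 100,000 (donations or person-years)."""
    return 100_000 * events / denominator

# Actual 2014 figures reported above: 13 HIV-positive donations out of
# roughly 1.9 million donations.
print(round(per_100_000(13, 1_900_000), 1))   # ~0.7 per 100,000 donations

# Predicted: 30 HIV-positive donations out of roughly 2 million donations.
# With the rounded denominator this gives ~1.5; the presenter's figure of
# 1.4 presumably reflects the exact predicted totals.
print(round(per_100_000(30, 2_000_000), 1))   # ~1.5 per 100,000 donations
```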
“So the estimated risk of HIV post-change remains very low,” Davidson noted, adding that she and her colleagues will continue to monitor the impact of the policy change.
*Data in the abstract differ from data presented at the meeting.
Interventions can treat, prevent iron deficiency in blood donors
ANAHEIM, CA—Data from the STRIDE study have revealed interventions that can mitigate iron deficiency in repeat blood donors.
The study showed that providing repeat blood donors with iron supplements significantly improved their iron status.
But informing donors about their ferritin levels and recommending they take iron pills also significantly improved their iron status.
Meanwhile, donors in the control groups became more iron-deficient over the study period.
The study also revealed no difference in ferritin or hemoglobin levels between donors who took 19 mg of iron and those who took 38 mg.
Alan E. Mast, MD, PhD, of the Blood Center of Wisconsin in Milwaukee, presented these results at the 2015 AABB Annual Meeting (abstract S34-030E).
Dr Mast said blood donation removes a lot of iron, and iron is used to make hemoglobin in new red blood cells. But the measurement of hemoglobin does not accurately reflect iron stores.
“That’s really important,” he said. “The only test we do to qualify a blood donor doesn’t tell us if they have iron deficiency or not. And because of that, many regular blood donors become iron-deficient and continue to donate blood.”
Dr Mast said the strategies that appear to mitigate iron deficiency in regular blood donors are oral iron supplements and extending the interval between donations to more than 6 months.
“[However,] the effectiveness of providing iron pills versus providing the donor with information about their iron status has not been previously examined,” he noted.
This was the goal of the STRIDE (Strategies to Reduce Iron Deficiency) study.
Study design
This blinded, randomized, placebo-controlled study enrolled 692 frequent blood donors from 3 blood centers. They were assigned to 1 of 5 arms for 2 years of follow-up.
In 3 arms, donors received pills for 60 days after each donation. They received 38 mg or 19 mg of elemental iron, or they received a placebo.
Donors in the remaining 2 arms received letters after each donation—either a letter informing them of their iron status or a “control” letter thanking them for donating blood and urging them to donate again.
Every iron status letter reported the donor’s ferritin level. If the level was >26 µg/L, the letter simply urged donors to donate again. If the ferritin level was ≤26 µg/L, the letter recommended taking a self-purchased iron supplement (17 mg to 38 mg) and/or delaying donation for 6 months. Donors were allowed to choose either option, both, or neither.
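A minimal sketch of the decision rule the iron status letters applied, using the thresholds and options described above; the function name and message wording are hypothetical paraphrases, not the study's actual letter text.

```python
def iron_status_letter(ferritin_ug_l):
    """Return the gist of the iron status letter, per the rules described above.

    Ferritin is in micrograms per liter. The recommendation text is a
    paraphrase; donors could follow either suggestion, both, or neither.
    """
    if ferritin_ug_l > 26:
        return "Thank you for donating; please donate again."
    return ("Your ferritin suggests low iron stores. Consider taking a "
            "self-purchased iron supplement (17-38 mg) and/or delaying "
            "your next donation for 6 months.")

print(iron_status_letter(40))   # urged to donate again
print(iron_status_letter(18))   # supplement and/or 6-month delay recommended
```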
The researchers measured ferritin, soluble transferrin receptor, and complete blood count at each donation.
Study completion
Of the 692 subjects randomized, 393 completed a final visit. The researchers noted that ferritin level at enrollment, race, and gender did not affect study completion. However, older subjects were more likely to complete the study.
In all, 116 subjects were lost to follow-up, and the numbers were similar between the study arms. Thirty-nine subjects discontinued due to adverse events—16 in the 38 mg iron group, 12 in the 19 mg iron group, and 11 in the placebo group.
And 144 subjects discontinued for “other reasons”—9 in the iron status letter arm, 10 in the control letter arm, 30 in the 38 mg iron arm, 42 in the 19 mg iron arm, and 53 in the placebo arm.
Subjects’ reasons for discontinuation included not wanting to take a pill every day, believing they were in the placebo group and wanting to take iron, and having a physician recommend they start taking iron.
“Donors in pill arms de-enrolled more frequently than those in the letter arms, and the important thing to remember is that this is a controlled, randomized study where the donors did not know what they were taking,” Dr Mast said. “And I think that, a lot of the time, if donors had known what they were taking, they might have continued to participate in the study or continued to take the pills.”
Results
Dr Mast noted that, at the study’s end, all measures of iron deficiency were statistically indistinguishable in the 3 intervention arms, which were statistically different from the 2 control arms.
Between study enrollment and the donors’ final visit, the prevalence of ferritin <26 µg/L was unchanged in the control groups. But it had declined by more than 50% in the 3 intervention groups—19 mg iron, 38 mg iron, and iron status letter (P<0.0001 for all 3).
The prevalence of ferritin <12 µg/L was unchanged in the 2 control arms, but it had declined by more than 70% in the 3 intervention arms—19 mg iron (P<0.0001), 38 mg iron (P<0.01), and iron status letter (P<0.0001).
The researchers also calculated the odds ratios for iron deficiency over all donor visits. The odds for ferritin <26 or <12 µg/L decreased more than 80% in the 19 mg iron group (P<0.01 for both ferritin measurements) and the 38 mg iron group (P<0.01 for both).
The odds for ferritin <26 or <12 µg/L decreased about 50% in the iron status letter arm (P<0.01 for both).
And the odds for ferritin <12 µg/L increased about 50% in the control groups (P<0.01 for both the placebo and control letter groups). However, there was no significant difference for ferritin <26 µg/L in either control group.
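For readers translating these percentages back into odds ratios: a decrease in odds of more than 80% corresponds to an odds ratio below 0.2, and an increase of about 50% corresponds to an odds ratio of about 1.5. The sketch below is illustrative only and does not use the study's actual model output.

```python
def percent_change_in_odds(odds_ratio):
    """Convert an odds ratio to the percent change in odds it implies."""
    return 100.0 * (odds_ratio - 1.0)

# An odds ratio of 0.2 is an 80% decrease in the odds of iron deficiency;
# an odds ratio of 1.5 is a 50% increase. These example ORs are
# illustrative, not taken from the study.
print(percent_change_in_odds(0.2))   # -80.0
print(percent_change_in_odds(1.5))   # 50.0
```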
Lastly, the researchers performed longitudinal modeling of hemoglobin. They found that hemoglobin increased >0.03 g/dL in the 19 mg and 38 mg iron arms (P<0.01 for both), decreasing the odds for low hemoglobin deferral about 50%.
Hemoglobin decreased >0.3 g/dL in the control groups (P<0.0001 for both the placebo and control letter groups), increasing the odds for low hemoglobin deferral about 70%.
“Interestingly, [being] in the iron status letter group did not affect hemoglobin that much in the longitudinal modeling of the donors,” Dr Mast noted.
In closing, he pointed out that the 19 mg and 38 mg iron pills were equally effective for mitigating iron deficiency and improving hemoglobin in these blood donors.
“From a physiology point of view, I think this is one of the most important results of this study,” Dr Mast said. “There’s absolutely no difference. There was no trend for 38 mg to be better than 19 in any analysis that we did.”
“There’s lots of reasons that could be happening, but I think it’s scientifically interesting and operationally interesting. And it’s important because we can tell donors—ask them to take a multivitamin with 19 mg of iron, and that will be sufficient to treat iron deficiency.”
ANAHEIM, CA—Data from the STRIDE study have revealed interventions that can mitigate iron deficiency in repeat blood donors.
The study showed that providing repeat blood donors with iron supplements significantly improved their iron status.
But informing donors about their ferritin levels and recommending they take iron pills also significantly improved their iron status.
Meanwhile, patients in control groups became more iron-deficient over the study period.
The study also revealed no difference in ferritin or hemoglobin levels between patients who took 19 mg of iron and those who took 38 mg.
Alan E. Mast, MD, PhD, of the Blood Center of Wisconsin in Milwaukee, presented these results at the 2015 AABB Annual Meeting (abstract S34-030E).
Dr Mast said blood donation removes a lot of iron, and iron is used to make hemoglobin in new red blood cells. But the measurement of hemoglobin does not accurately reflect iron stores.
“That’s really important,” he said. “The only test we do to qualify a blood donor doesn’t tell us if they have iron deficiency or not. And because of that, many regular blood donors become iron-deficient and continue to donate blood.”
Dr Mast said the strategies that appear to mitigate iron deficiency in regular blood donors are oral iron supplements and delaying the donation interval for more than 6 months.
“[However,] the effectiveness of providing iron pills versus providing the donor with information about their iron status has not been previously examined,” he noted.
This was the goal of the STRIDE (Strategies to Reduce Iron Deficiency) study.
Study design
This blinded, randomized, placebo-controlled study enrolled 692 frequent blood donors from 3 blood centers. They were assigned to 1 of 5 arms for 2 years of follow-up.
In 3 arms, donors received pills for 60 days after each donation. They received 38 mg or 19 mg of elemental iron, or they received a placebo.
Donors in the remaining 2 arms received letters after each donation—either a letter informing them of their iron status or a “control” letter thanking them for donating blood and urging them to donate again.
Every iron status letter reported the donor’s ferritin level. If the level was >26 mg/L, the letter simply urged donors to donate again. If the ferritin level was ≤26 mg/L, the letter recommended taking a self-purchased iron supplement (17 mg to 38 mg) and/or delaying donation for 6 months. Donors were allowed to choose either option, both, or neither.
The researchers measured ferritin, soluble transferrin receptor, and complete blood count at each donation.
Study completion
Of the 692 subjects randomized, 393 completed a final visit. The researchers noted that a donor’s ferritin level at enrollment, race, or gender did not impact study completion. However, older subjects were more likely to complete the study.
In all, 116 subjects were lost to follow-up, and the numbers were similar between the study arms. Thirty-nine subjects discontinued due to adverse events—16 in the 38 mg iron group, 12 in the 19 mg iron group, and 11 in the placebo group.
And 144 subjects discontinued for “other reasons”—9 in the iron status letter arm, 10 in the control letter arm, 30 in the 38 mg iron arm, 42 in the 19 mg iron arm, and 53 in the placebo arm.
Subjects’ reasons for discontinuation included not wanting to take a pill every day, believing they are in the placebo group and wanting to take iron, and subjects’ physicians recommending they start taking iron.
“Donors in pill arms de-enrolled more frequently than those in the letter arms, and the important thing to remember is that this is a controlled, randomized study where the donors did not know what they were taking,” Dr Mast said. “And I think that, a lot of the time, if donors had known what they were taking, they might have continued to participate in the study or continued to take the pills.”
Results
Dr Mast noted that, at the study’s end, all measures of iron deficiency were statistically indistinguishable in the 3 intervention arms, which were statistically different from the 2 control arms.
Between study enrollment and the donors’ final visit, the prevalence of ferritin <26 mg/L was unchanged in the control groups. But it had declined by more than 50% in the 3 intervention groups—19 mg iron, 38 mg iron, and iron status letter (P<0.0001 for all 3).
The prevalence of ferritin <12 mg/L was unchanged in the 2 control arms, but it had declined by more than 70% in the 3 intervention arms—19 mg iron (P<0.0001), 38 mg iron (P<0.01), and iron status letter (P<0.0001).
The researchers also calculated the odds ratios for iron deficiency over all donor visits. The odds for ferritin <26 or <12 mg/L decreased more than 80% in the 19 mg iron group (P<0.01 for both ferritin measurements) and the 38 mg iron group (P<0.01 for both).
The odds for ferritin <26 or <12 mg/L decreased about 50% in the iron status letter arm (P<0.01 for both).
And the odds for ferritin <12 mg/L increased about 50% in the control groups (P<0.01 for both the placebo and control letter groups). However, there was no significant difference for ferritin <26 mg/L in either control group.
Lastly, the researchers performed longitudinal modeling of hemoglobin. They found that hemoglobin increased >0.03 g/dL in the 19 mg and 38 mg iron arms (P<0.01 for both), decreasing the odds for low hemoglobin deferral about 50%.
Hemoglobin decreased >0.3 g/dL in the control groups (P<0.0001 for both the placebo and control letter groups), increasing the odds for low hemoglobin deferral about 70%.
“Interestingly, [being] in the iron status letter group did not affect hemoglobin that much in the longitudinal modeling of the donors,” Dr Mast noted.
In closing, he pointed out that the 19 mg and 38 mg iron pills were equally effective for mitigating iron deficiency and improving hemoglobin in these blood donors.
“From a physiology point of view, I think this is one of the most important results of this study,” Dr Mast said. “There’s absolutely no difference. There was no trend for 38 mg to be better than 19 in any analysis that we did.”
“There’s lots of reasons that could be happening, but I think it’s scientifically interesting and operationally interesting. And it’s important because we can tell donors—ask them to take a multivitamin with 19 mg of iron, and that will be sufficient to treat iron deficiency.”
Assay can detect and classify DOACs
Photo by Juan D. Alfonso
ANAHEIM, CA—A new assay can detect and classify direct oral anticoagulants (DOACs) quickly and effectively, according to researchers.
In tests, the assay detected DOACs with greater than 90% sensitivity and specificity.
The assay classified the direct thrombin inhibitor (DTI) dabigatran correctly 100% of the time and classified factor Xa inhibitors (anti-Xa), which included rivaroxaban and apixaban, correctly 92% of the time.
The researchers believe this assay has the potential to be an effective tool for managing patients on DOACs who experience trauma or stroke, as well as those who require emergency or urgent surgery. And the ability to identify the type of anticoagulant a patient is taking can guide the reversal strategy.
Fowzia Zaman, PhD, of Haemonetics Corporation in Rosemont, Illinois, described the assay at the 2015 AABB Annual Meeting (abstract S60-030K). Haemonetics is the company developing the assay, and this research was supported by the company.
About the assay
“The current coagulation assays are not very sensitive to DOACs, especially in the therapeutic range,” Dr Zaman said. “Right now, there is no assay available that can classify the DOACs. This new assay can both detect and classify, and it will classify the DOACs either as a DTI or an anti-Xa.”
The assay is performed using Haemonetics’ TEG 6s system, a fully automated system for evaluating anticoagulation in a patient. It is based on viscoelasticity measurements using resonance frequency and disposable microfluidic cartridges. Each cartridge has 4 channels, and 2 of the channels are used for detection and classification.
Detection is performed using a factor Xa-based reagent, and classification utilizes an Ecarin-based reagent. All of the reagents are contained within the channel, so there is no reagent preparation required.
Each cartridge is loaded into the unit, and citrated whole blood is added, either with a transfer pipette or a syringe, to start the assay.
Reaction time (R-time) is used for detection and classification. R is defined as the time from the start of the sample run to the point of clot formation. It corresponds to an amplitude of 2 mm on the TEG tracing. It represents the initial enzymatic phase of clotting, and it is recorded in minutes.
Study population
The researchers tested the assay in 26 healthy subjects, 25 patients on DTI (all dabigatran), and 40 on anti-Xa therapy (24 on rivaroxaban, 16 on apixaban).
For healthy subjects, the mean age was 41±13 years, and 46% of subjects were male. Forty-six percent were Caucasian, 39% were African American, and 15% were Asian/“other”. The partial thromboplastin time (PTT) for these subjects was within the normal range, at 27.2±1.8 seconds.
In the DOAC population, the mean age was 68±12 years for the anti-Xa group and 69±10 years for the DTI group. Fifty percent and 72%, respectively, were male. And 50% and 64%, respectively, were Caucasian.
Most of the patients receiving DOACs were taking them for atrial fibrillation—88% in the anti-Xa group and 84% in the DTI group. Other underlying conditions were coronary artery disease—28% and 32%, respectively—and hypertension—60% and 64%, respectively.
Some patients were taking aspirin in addition to DOACs—30% in the anti-Xa group and 24% in the DTI group. And some were taking P2Y12 inhibitors—20% in the anti-Xa group and 24% in the DTI group.
The PTT was 30.4±4.6 seconds for the anti-Xa group and 36.6±7 seconds for the DTI group. Creatinine levels were 1.07±0.6 mg/dL and 1.05±0.2 mg/dL, respectively.
Assay results
The researchers analyzed citrated whole blood from the healthy volunteers to establish the baseline reference range. The cutoff for detection was 1.95 minutes, and the cutoff for classification was 1.9 minutes.
“What this means is that a person who does not have DOAC in their system should have an R-time of less than or equal to 1.95 minutes,” Dr Zaman explained.
The researchers also developed an algorithm for the detection and classification of DOACs. According to this algorithm, healthy subjects would have a short R-time in the detection channel and the classification channel.
Patients on anti-Xa would have a long R-time in the detection channel but a short R-time in the classification channel. And patients on a DTI would have a long R-time in both the detection channel and the classification channel.
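The rule described above can be written as two threshold checks, one per channel. The sketch below is a hypothetical illustration; the function name and output labels are not from the presentation, and the cutoffs are simply the reported values of 1.95 minutes (detection) and 1.9 minutes (classification).

```python
# Hypothetical sketch of the detection/classification rule described above.
# The cutoffs are the reported channel cutoffs; names and labels are illustrative.

DETECTION_CUTOFF_MIN = 1.95      # detection channel (factor Xa-based reagent)
CLASSIFICATION_CUTOFF_MIN = 1.9  # classification channel (Ecarin-based reagent)

def interpret_r_times(detection_r_min: float, classification_r_min: float) -> str:
    """Interpret R-times (in minutes) from the two TEG 6s channels."""
    if detection_r_min <= DETECTION_CUTOFF_MIN:
        return "no DOAC detected"               # short R-time in both channels
    if classification_r_min <= CLASSIFICATION_CUTOFF_MIN:
        return "factor Xa inhibitor (anti-Xa)"  # long detection, short classification
    return "direct thrombin inhibitor (DTI)"    # long R-time in both channels

# Example with hypothetical R-times:
print(interpret_r_times(3.2, 1.4))  # factor Xa inhibitor (anti-Xa)
```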
The researchers found that, in the detection channel, R-time was increased, on average, by 66% for dabigatran, 125% for rivaroxaban, and 100% for apixaban, compared to the reference range. But the degree of prolongation depended on the individual patient and the time since the last DOAC dose.
Using a cutoff of 2 minutes, the detection channel demonstrated 94% sensitivity and 96% specificity for all the DOACs combined.
“What this means is that, when a patient had a DOAC in their system, the assay was able to pick it up 94% of the time,” Dr Zaman explained.
In addition, the assay classified dabigatran correctly 100% of the time and anti-Xa therapy correctly 92% of the time.
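For reference, the sensitivity and specificity figures follow from simple counts of correctly and incorrectly flagged samples. The sketch below is illustrative only; the counts are hypothetical values chosen to land near the reported 94% sensitivity and 96% specificity and were not reported by the researchers.

```python
# Illustrative only: how sensitivity and specificity are computed from counts.
# The counts below are hypothetical and merely chosen so the results land near
# the reported 94% sensitivity and 96% specificity for DOAC detection.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of DOAC-containing samples correctly flagged as positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of DOAC-free samples correctly flagged as negative."""
    return true_neg / (true_neg + false_pos)

print(f"sensitivity: {sensitivity(true_pos=61, false_neg=4):.2f}")  # 0.94
print(f"specificity: {specificity(true_neg=25, false_pos=1):.2f}")  # 0.96
```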
“This TEG 6s DOAC assay is highly sensitive and specific for detecting and classifying DOACs,” Dr Zaman said in closing. “[T]he cutoffs for both the channels are close to 2 minutes, which means clinically relevant results are available within 5 minutes.”
“There is no reagent prep necessary, and it utilizes whole blood, so [there is] no spinning down to plasma. Therefore, it has the potential to be a point-of-care assay.”
Blood donors’ genetic background affects hemolysis
ANAHEIM, CA—Interim results of a large study suggest a blood donor’s genetic background and frequency of donation influence red blood cell (RBC) storage and stress hemolysis.
Investigators found that donor ethnicity and gender both affected hemolysis, but the effects sometimes differed between storage and stress hemolysis.
Similarly, RBCs from frequent donors were more susceptible to storage and osmotic hemolysis but less susceptible to oxidative hemolysis.
Tamir Kanias, PhD, of the University of Pittsburgh in Pennsylvania, presented these findings at the 2015 AABB Annual Meeting (abstract S73-040A).
“We now know that some donor red cells store very well, and, even after 42 days of storage, there is hardly any hemolysis,” Dr Kanias noted. “[But for] some donors, their red cells are starting to degrade maybe 5 or 6 days after collection.”
With that in mind, Dr Kanias and his colleagues set out to define the genetic and metabolic basis for donor-specific differences in hemolysis in stored RBCs.
They analyzed RBCs collected at 4 centers as part of the REDS-III study. The team took 15 mL of RBCs from fresh units donated for transfusion and stored the cells in transfer bags to measure hemolysis. The transfer bags are miniature versions of the bags used to store RBCs for transfusion.
Dr Kanias presented interim findings in samples from more than 8000 donors. He and his colleagues looked at donor ethnicity, gender, and age. The team also assessed whether subjects were “high-intensity” donors, which was defined as donating RBCs 10 or more times in the previous 24 months without a low-hemoglobin deferral.
The donors’ samples were stored for 39 to 42 days before the investigators assessed hemolysis. They measured end-of-storage hemolysis in unwashed red cells, then washed the RBCs and assessed osmotic hemolysis (Pink test) and oxidative hemolysis (AAPH).
Ethnicity and intensity
Tests showed that RBCs from African American and high-intensity donors (more than 90% of whom were Caucasian) were more susceptible to storage hemolysis than RBCs from the other donor groups analyzed.
RBCs from Caucasian donors and high-intensity donors were more susceptible to osmotic hemolysis, while RBCs from African American and Asian donors were more resistant.
“We hypothesize that this [resistance] may be related to some of these donors carrying traits for sickle cell disease or thalassemia,” Dr Kanias said. “Both diseases are known to render red cells more resistant to osmotic hemolysis, but of course, it could be [explained by] new mutations that we don’t know of.”
RBCs from Hispanic donors and African American donors were more susceptible to oxidative hemolysis, but the opposite was true of RBCs from high-intensity donors.
“What was really interesting is that the high-intensity donors that had higher end-of-storage hemolysis and higher susceptibility to osmotic hemolysis actually became more resistant to oxidative hemolysis,” Dr Kanias said.
“It is possible that the lower levels of iron in the red cells of these donors actually protects from oxidative hemolysis. Iron is redox-active, and a lot of the AAPH-induced hemolysis is mediated by iron interactions.”
Group comparisons
Looking at the data another way, the investigators compared samples from Caucasians to samples from the other ethnic groups and the high-intensity donors.
RBCs from African American donors had significantly higher storage hemolysis (P=0.0078), lower osmotic hemolysis (P<0.0001), and higher oxidative hemolysis (P=0.0008) than RBCs from Caucasians.
RBCs from Asians had significantly lower osmotic hemolysis (P<0.0001) than Caucasian RBCs, but there was no significant difference in storage hemolysis (P=0.69) or oxidative hemolysis (P=0.41) between the 2 groups.
RBCs from Hispanic donors were significantly more susceptible to oxidative hemolysis (P<0.0001) than Caucasian RBCs, but there was no significant difference between the groups with regard to storage hemolysis (P=0.89) or osmotic hemolysis (P=0.10).
RBCs from high-intensity donors had significantly higher storage hemolysis (P<0.0001) and lower oxidative hemolysis (P<0.0001) than Caucasian RBCs. There was no significant difference in osmotic hemolysis (P=0.84).
Gender and age
As in other studies, Dr Kanias and his colleagues found that RBCs from females hemolyzed significantly less than RBCs from males. This was true for storage hemolysis, osmotic hemolysis, and oxidative hemolysis (P<0.0001 for all).
“Just to note, the gender effect was more dramatic in storage and osmotic rather than oxidative, which suggests that the gender effect is more on the membrane or membrane integrity rather than antioxidant capacity,” Dr Kanias said.
He and his colleagues then looked at donor age and observed the gender effect at every age analyzed (18 to 65+). He noted that hemolysis fluctuated throughout the age groups, so the investigators couldn’t draw any concrete conclusions about hemolysis and donor age.
“One interesting thing to note is that, in all the assays, in young males—like around 20—there’s an increase in hemolysis where there’s a decrease in females,” Dr Kanias said. “This may be related to the effect of sex hormones.”
Genetic modifiers
The investigators also assessed how the 3 hemolytic assays relate to each other and found very weak correlations between them. Pearson correlations were 0.12 between storage and osmotic hemolysis, 0.0041 between storage and oxidative hemolysis, and 0.058 between osmotic and oxidative hemolysis.
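For context, pairwise Pearson correlations of this kind are computed from per-donor measurements across the three assays. The sketch below is a hypothetical illustration with simulated, uncorrelated data; it shows the computation only and does not reproduce the study's values.

```python
# Hypothetical illustration: pairwise Pearson correlations between three
# per-donor hemolysis measures. The data here are simulated and uncorrelated;
# only the computation mirrors the analysis described above.
import numpy as np

rng = np.random.default_rng(seed=0)
n_donors = 1000  # hypothetical sample size

storage = rng.normal(0.3, 0.1, n_donors)      # end-of-storage hemolysis (%)
osmotic = rng.normal(25.0, 8.0, n_donors)     # osmotic hemolysis, Pink test (%)
oxidative = rng.normal(30.0, 10.0, n_donors)  # oxidative hemolysis, AAPH (%)

corr = np.corrcoef(np.vstack([storage, osmotic, oxidative]))
print(f"storage vs osmotic:   {corr[0, 1]:.3f}")
print(f"storage vs oxidative: {corr[0, 2]:.3f}")
print(f"osmotic vs oxidative: {corr[1, 2]:.3f}")
```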
“This is kind of cool because it may mean that there is a different genetic modifier affecting each of these phenomena,” Dr Kanias said.
He and his colleagues are now working to identify genetic and metabolic modifiers of hemolysis.
Canada may shorten deferral for MSM blood donors
Photo by Charles Haymond
ANAHEIM, CA—Lifting the lifetime ban on blood donations from men who have sex with men (MSM) has not altered the safety of the blood supply in Canada, according to a new study.
The study showed no increase in the rate of HIV-positive blood donations since Canada changed its policy regarding MSM blood donors, allowing MSM to donate if they have not had sexual contact with another man in the last 5 years.
Because of this finding, Canada may shorten the deferral period for MSM blood donors to 1 year, according to Sheila F. O’Brien, PhD, of Canadian Blood Services in Ottawa, Ontario, Canada.
Dr O’Brien mentioned this possibility and presented data from the study at the 2015 AABB Annual Meeting (abstract S35-030E*).
Prior to 2013, MSM in Canada were not allowed to donate blood if they had any sexual contact with another male since 1977. Females were barred from donating if, in the last year, they had sexual contact with a man who had sex with another man after 1977.
On July 22, 2013, Canada changed this policy so that MSM can donate blood if they have abstained from sexual contact with another man for the past 5 years. The deferral period for females is still 12 months if they have had sex with a man who has had sex with another man in the last 5 years, but there is no deferral if the man had sex with another man more than 5 years before.
To evaluate the impact of this policy change, Dr O’Brien and her colleagues assessed compliance with the MSM criteria before and after the change, as well as the number of HIV-positive blood donations before and after the change.
The researchers also assessed the number of donors who would have been deferred according to the old MSM criteria but donated blood under the new criteria.
MSM history
The researchers selected random male donors of whole blood each month from October 2012 to February 2013 (pre-change) and from October 2014 to February 2015 (post-change). These donors were invited to complete an anonymous online survey about their MSM history.
The survey was completed by 9669 donors before the policy change and 6881 donors after the change. There were 77 donors with MSM history before the change (20% first-time donors, 80% repeat) and 75 donors with MSM history after the change (22% first-time, 78% repeat).
Compliance with policy
After the change in policy for MSM blood donors, there was no significant change in the proportion of donors who had recent MSM history but donated anyway (non-compliant). Before the change, 0.37% of blood donors had an MSM partner in the last 5 years, compared to 0.43% after the change (P=0.54).
However, there was a significant change in the proportion of blood donors with MSM history further in the past. Before the MSM policy change, 0.42% of donors had an MSM partner but not in the last 5 years, compared to 0.66% of donors after the change (P=0.04).
“So we have an improvement in compliance, but it’s mainly because the donors are no longer deferrable,” Dr O’Brien explained.
“Donating while ineligible because of MSM history is actually quite rare, and the percentage of donors with MSM history in the last 5 years did not change when we changed the criteria. But we did see a modest increase in newly eligible MSM, so those that had more than 5 years since their last male-to-male sex.”
In all, 112 donors were newly eligible due to the policy change and did, in fact, donate blood between July 22, 2013, and July 21, 2015. Five of these donors were females who had had sexual contact with an MSM partner.
There were 70 “reinstated” donors in the first year after the policy change and 42 in the second year.
HIV-positive donations
The researchers monitored HIV rates in all blood donations from January 2010 to March 2015.
The rates of HIV-positive donations, per 100,000 donations, were as follows: 0.20 for 2010 (2/989,916), 0.50 for 2011 (5/995,122), 0.51 for 2012 (5/987,527), 0 from January 1, 2013, to July 21, 2013, before the policy change (0/525,337), 0.54 from July 22, 2013, to July 21, 2014 (5/929,656), and 0.22 from July 22, 2014, to July 21, 2015 (2/893,513).
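The per-100,000 rates above can be checked directly from the reported counts; a minimal sketch of that arithmetic:

```python
# Minimal check of the HIV-positive donation rates quoted above, using the
# reported counts (HIV-positive donations, total donations) for each period.
periods = {
    "2010": (2, 989_916),
    "2011": (5, 995_122),
    "2012": (5, 987_527),
    "Jan 1, 2013 - Jul 21, 2013": (0, 525_337),
    "Jul 22, 2013 - Jul 21, 2014": (5, 929_656),
    "Jul 22, 2014 - Jul 21, 2015": (2, 893_513),
}

for period, (positives, donations) in periods.items():
    rate_per_100k = positives / donations * 100_000
    print(f"{period}: {rate_per_100k:.2f} per 100,000 donations")
```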
“So absolutely no change in HIV rate following implementation of our 5-year deferral,” Dr O’Brien said.
In all, there were 7 HIV-positive donations after the policy change. Four were from male donors, and 3 were from females.
Three of the male donors (2 first-time donors, 1 repeat) denied having MSM risk factors, and 1 first-time male donor was aware he was HIV-positive at the time of donation. This man said he donated to determine if his HIV medication was working.
Two of the females were repeat donors, and 1 was a first-timer. The first-time donor did not acknowledge any MSM risk factors. One of the repeat donors had a sexual relationship with a bisexual male who was HIV-positive. The other repeat donor had multiple sexual partners, 1 of whom was known to be hepatitis C-positive.
Future policy change
Dr O’Brien noted that the LGBTQ community in Canada has advocated abolishing the deferral period for MSM blood donors or changing to a risk-based policy that would allow more individuals with MSM history to donate blood.
She said the combined blood services in Canada—Canadian Blood Services and Héma-Québec—are now considering a 12-month deferral period for individuals with MSM history.
“We’re pretty sure we’re going to go ahead,” she noted.
However, the groups must submit a policy request to Health Canada, which will ultimately make the decision.
*Data in the abstract differ from data presented at the meeting.
A better FLT3 inhibitor for AML?
© ASCO/Scott Morgan
CHICAGO—A dual inhibitor of FLT3 and Axl may produce more durable responses than other FLT3 inhibitors and improve survival in patients with FLT3-positive, relapsed or refractory acute myeloid leukemia (AML), according to a speaker at the 2015 ASCO Annual Meeting.
The FLT3/Axl inhibitor, ASP2215, has not been compared against other FLT3 inhibitors directly, and the data presented were from a phase 1/2 study.
However, the speaker said ASP2215 provided “potent and sustained” inhibition of FLT3 and produced an overall response rate (ORR) of 52% among FLT3-positive patients.
The median duration of response for these patients was 18 weeks, and their median overall survival was about 27 weeks.
Mark J. Levis, MD, PhD, of the Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins in Baltimore, Maryland, presented these results at ASCO as abstract 7003.*
“We’ve been studying FLT3 inhibitors for a number of years now,” Dr Levis began, “and we think they show significant clinical promise, [but] they also have problems.”
He noted that some of these drugs haven’t been particularly safe or well-tolerated. They can cause gastrointestinal toxicity, hand-foot syndrome, QT prolongation, and myelosuppression.
However, the most intriguing problem with FLT3 inhibitors, according to Dr Levis, is the emergence of resistance-conferring point mutations observed in studies of some of the newer drugs, such as sorafenib and quizartinib.
“So in that context, here we have ASP2215,” Dr Levis said. “This is a type 1 FLT3 tyrosine kinase inhibitor, and, as such, it has activity against not only wild-type and ITD-mutated FLT3 but also against those resistance-conferring point mutations typically found in the activation loop at the so-called gatekeeper residue (F691L).”
With this in mind, he and his colleagues conducted a phase 1/2 trial of ASP2215. The study was sponsored by Astellas Pharma Global Development, Inc., the company developing ASP2215.
The trial was open to patients with relapsed or refractory AML, irrespective of their FLT3 mutation status. The researchers’ goal was to identify a safe, tolerable dose of ASP2215 that fully inhibited FLT3.
The team used a standard 3+3 design, with dose levels ranging from 20 mg to 450 mg once daily. They expanded every cohort until they reached a dose-limiting toxicity.
In all, the trial enrolled 198 patients, 24 in the dose-escalation cohorts and 174 in the dose-expansion cohorts. The patients’ median age was 62 (range, 21-90), and 53.1% were male.
Nearly 66% of patients had FLT3 mutations, 29.4% were FLT3-negative, and 5.2% had unknown FLT3 status. About 35% of patients had received 1 prior line of therapy, 26.3% had 2, 33.5% had 3 or more, and 5.7% had an unknown number of prior therapies.
Safety results
In the 194 patients who were evaluable for safety, treatment-emergent adverse events included diarrhea (13.4%), fatigue (12.4%), AST increase (11.3%), ALT increase (9.3%), thrombocytopenia (7.7%), anemia (7.2%), peripheral edema (7.2%), constipation (6.7%), nausea (6.7%), dizziness (6.2%), vomiting (5.7%), and dysgeusia (5.2%).
Serious adverse events included febrile neutropenia (27.3%), sepsis (11.9%), disease progression (10.3%), pneumonia (8.8%), hypotension (5.7%), and respiratory failure (5.7%).
“The kinds of side effects we saw were typical for a relapsed/refractory AML population,” Dr Levis said. “Nothing really stood out. Any trial of relapsed/refractory AML is going to have febrile neutropenia and sepsis, and those were our dominant, serious adverse events. There was no real safety signal here that was unique to the drug, we felt.”
The researchers said the maximum-tolerated dose of ASP2215 was 300 mg, as 2 patients who received the 450 mg dose experienced dose-limiting toxicities. One was grade 3 diarrhea, and the other was grade 3 AST elevation.
Response and survival
Among the 127 patients who were FLT3-positive, the ORR was 52% (n=66). The complete response (CR) rate was 6.3% (n=8). The composite CR rate, which includes CRs with incomplete hematologic recovery (CRi) and incomplete platelet recovery (CRp), was 40.9% (n=52). And the partial response (PR) rate was 11% (n=14).
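As a quick arithmetic check (not part of the presentation), the composite CR and PR categories add up to the reported ORR among the 127 FLT3-positive patients; a minimal sketch:

    # Response categories among FLT3-positive patients (n=127)
    n = 127
    cr = 8             # complete responses
    composite_cr = 52  # CR + CRi + CRp
    pr = 14            # partial responses
    orr = composite_cr + pr
    print(orr, f"{orr / n:.1%}")      # 66 responders, 52.0% ORR
    print(f"{cr / n:.1%}")            # 6.3% CR rate
    print(f"{composite_cr / n:.1%}")  # 40.9% composite CR rate
    print(f"{pr / n:.1%}")            # 11.0% PR rate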
“As we scale up the dose, the PRs shift on over to CRis, and the dominant response is, in fact, a complete response with incomplete count recovery,” Dr Levis said. “The categories where we had the largest number of responses were the 120 mg and 200 mg categories. We didn’t really have enough patients in the 300 mg category to comment on it.”
For the FLT3-positive patients, the median duration of response was 126 days.
“The duration of response really stood out here,” Dr Levis said. “It’s over 4 months. That is something we really didn’t see with the other drugs, and I suspect that is a reflection of the suppression of the outgrowth of these resistance mutations.”
Unfortunately, FLT3-wild-type patients did not fare as well. The ORR among these patients was 8.8%. None of the patients achieved a CR, 3 had a composite CR (5.3%), and 2 had a PR (3.5%).
Among the FLT3-positive patients, the median overall survival was about 27 weeks. It was 128 days in the 20 mg dose cohort (n=13), 105.5 days in the 40 mg cohort (n=8), 201 days in the 80 mg cohort (n=12), 199 days in the 120 mg cohort (n=40), and 161 days in the 200 mg cohort (n=45). Dr Levis did not present survival data for the 300 mg or 450 mg cohorts, which included 7 and 2 patients, respectively.
“[Relapsed/refractory AML] is a population that has a median survival of about 3 months with conventional therapy, at least by historical publications,” Dr Levis noted. “If you look at survival in this trial, patients treated at the FLT3-inhibitory doses [had a] greater than 6-month median survival.”
He added that studies of ASP2215 in combination with other agents are ongoing in patients with newly diagnosed AML. And phase 3 trials of ASP2215 at the 120 mg dose, with the option of scaling up to 200 mg, are planned.
*Information in the abstract differs from that presented at the meeting.
CAR produces high CR rate in adults with rel/ref ALL
The 2015 ASCO Annual Meeting; photo © ASCO/Max Gersh
CHICAGO—A CD19-targeted chimeric antigen receptor (CAR) T-cell therapy can provide durable complete responses (CRs) or a bridge to allogeneic transplant in adults with relapsed or refractory acute lymphoblastic leukemia (ALL), updated results of a phase 1 study suggest.
The therapy, JCAR015, produced a CR rate of 87%, and 33% of these patients went on to transplant.
The median duration of response or relapse-free survival was 5.3 months. The median overall survival was 8.5 months.
Nearly a quarter of patients developed severe cytokine release syndrome (CRS), and nearly 30% experienced neurological toxicities. But researchers said these effects were largely treatable and reversible.
This study was temporarily placed on clinical hold last year, after 2 patients died from complications related to CRS. But the hold was soon lifted and enrollment and dosing criteria were changed in an attempt to prevent severe CRS.
Jae H. Park, MD, of Memorial Sloan Kettering Cancer Center in New York, presented updated results of this trial (NCT01044069) at the 2015 ASCO Annual Meeting (abstract 7010*). The study is sponsored by Memorial Sloan Kettering, but funding has also been provided by Juno Therapeutics, the company developing JCAR015.
Results from this trial have previously been reported in Science Translational Medicine (Davila et al 2014; Brentjens et al 2013), at AACR 2014, and at ASH 2014.
At ASCO, Dr Park presented results in 39 patients with relapsed/refractory, CD19+ ALL. All of them were evaluable for toxicity assessment, and 38 were evaluable for response with at least 1 month of follow-up.
There were 29 males, and the patients’ median age was 45 (range, 22-74). Thirty-three percent had Ph+ ALL, and 11% had the T315I mutation.
Forty-nine percent of patients had received 2 prior therapies, 23% had received 3, and 28% had received 4 or more. Thirty-six percent of patients had a prior allogeneic hematopoietic stem cell transplant (HSCT).
For this study, patients first underwent leukapheresis. While their T cells were being manufactured, they were allowed to receive salvage chemotherapy. Patients underwent repeat bone marrow biopsy to assess their disease status immediately prior to T-cell infusion.
Fifty-four percent of patients (n=21) had morphologic disease (>5% blasts in the bone marrow, median 52%) immediately prior to JCAR015 infusion, and the remaining patients (n=18) had minimal residual disease (MRD).
Two days after conditioning with cyclophosphamide, patients received an infusion of 1-3 x 10⁶ CAR T cells/kg. At day 28, the researchers assessed patients’ disease with a repeat bone marrow biopsy.
Treatment results
The median follow-up was 5.6 months (1 to >38 months). Six patients had more than a year of follow-up.
The CR rate was 87% (33/38), and 81% of evaluable patients (26/32) were MRD-negative. The median time to CR was 23 days, and the median duration of response or relapse-free survival was 5.3 months.
“We examined the CR rates by different subgroup,” Dr Park noted. “We looked at whether patients had a pre-T-cell disease burden: morphologic disease vs minimal residual disease, whether they had an allogeneic bone marrow transplant prior to CAR T-cell infusion, their Ph+ status, age at infusion, and prior lines of therapy. And there was no [significant] difference between these groups for CRs and MRD-negative CR rate.”
At the time of presentation, 14 patients were disease-free, 10 of whom had not gone on to HSCT. In all, 11 patients went on to allogeneic HSCT.
Fourteen patients relapsed during follow-up, 3 after HSCT. Two of these patients had CD19-negative bone marrow blasts.
The median overall survival was 8.5 months in all patients and 10.8 months in patients who were MRD-negative. The median overall survival was 9.9 months in patients who underwent allogeneic HSCT and 8.5 months in patients who did not.
Dr Park said key adverse events were CRS—clinically manifested by fever, hypotension, and respiratory insufficiency—and neurological changes such as delirium, global encephalopathy, aphasia, and seizures.
Twenty-three percent of patients (n=9) developed severe CRS, 28% (n=11) had grade 3/4 neurotoxicity, and 8% (n=3) had grade 5 toxicity. The patients with grade 5 toxicities died of ventricular arrhythmia, sepsis, and an unknown cause (although this patient suffered a seizure).
The severity of CRS correlated with disease burden, and CRS was managed with an IL-6R inhibitor (n=4), a steroid (n=2), or both (n=9). Neurological symptoms were reversible and could occur independently of CRS, Dr Park said.
*Information in the abstract differs from that presented at the meeting.
No survival difference with allo- or auto-SCT in PTCL
© ASCO/Max Gersh
CHICAGO—Allogeneic and autologous transplants produce similar survival rates when used as first-line therapy in younger patients with peripheral T-cell lymphoma (PTCL), according to interim results of the AATT trial.
The study also showed that deaths among patients who received autologous stem cell transplants (auto-SCTs) were a result of relapse and salvage treatment, while deaths among allogeneic SCT (allo-SCT) recipients were transplant-related.
Norbert Schmitz, MD, PhD, of Asklepios Hospital St. Georg in Hamburg, Germany, presented these findings at the 2015 ASCO Annual Meeting (abstract 8507*).
Dr Schmitz noted that the only previous study comparing auto-SCT with allo-SCT as first-line therapy in PTCL was not designed or powered to evaluate differences between the transplant types.
So he and his colleagues conducted the AATT trial to address this question. The team hypothesized that allo-SCT would improve 3-year event-free survival from 35% to 60%, given an α of 5% and a power of 80%. They calculated that 140 patients would be needed to test this hypothesis.
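The presentation did not detail the sample-size calculation. A rough two-proportion approximation (treating the 3-year event-free survival rates as simple proportions, with two-sided α=0.05 and 80% power) gives roughly 62 patients per arm, about 124 in total before any inflation for dropouts, which is in the same range as the 140 reported. A minimal sketch under those assumptions; the trial's actual method may have differed:

    import math

    def n_per_arm(p1, p2):
        """Patients per arm to compare two proportions (two-sided alpha=0.05, 80% power)."""
        z_alpha = 1.959964  # two-sided 5% critical value
        z_beta = 0.841621   # 80% power
        pbar = (p1 + p2) / 2
        numerator = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
                     + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
        return math.ceil(numerator / (p1 - p2) ** 2)

    print(n_per_arm(0.35, 0.60))  # ~62 per arm, ~124 total before accounting for dropouts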
Ultimately, the investigators enrolled 104 patients and performed an interim analysis when 58 patients were evaluable for response.
Of the 58 patients, 30 were randomized to the auto-SCT arm and 28 to the allo-SCT arm. Baseline characteristics were similar between the arms, including patients’ median ages (49 and 50, respectively), the proportion of patients with stage III/IV disease (87% and 93%), and the proportion with ECOG status greater than 1 (23% and 18%).
Most patients in both arms had PTCL not otherwise specified (36% in the auto-SCT arm and 50% in the allo-SCT arm). Other subtypes included angioimmunoblastic T-cell lymphoma (23% and 32%, respectively), ALK-negative anaplastic large-cell lymphoma (20% and 4%), and “other” PTCLs (20% and 8%). The other PTCLs were NK/T-cell lymphoma, intestinal T/NK-cell lymphoma, hepatosplenic γδ lymphoma, and subcutaneous panniculitis-like PTCL.
Treatment characteristics
Before undergoing transplant, patients in both arms received treatment with CHOEP (cyclophosphamide, doxorubicin, etoposide, vincristine, and prednisone) on days 1, 15, 29, and 43. If they experienced a complete response (CR), partial response, or no change, patients received DHAP (dexamethasone, cytarabine, and cisplatin) on day 64.
Patients in the auto-SCT arm received BEAM (carmustine, etoposide, cytarabine, and melphalan) prior to transplant. And patients in the allo-SCT arm received FBC (fludarabine, busulfan, and cyclophosphamide).
Overall, 36 patients (62%) completed treatment per protocol, 19 in the auto-SCT arm and 17 in the allo-SCT arm. Thirty-eight percent of all patients could not proceed to transplant per protocol, mostly because of early lymphoma progression.
Response and survival
The researchers observed CRs/unconfirmed CRs (CRus) in 33% (n=10) of patients in the auto-SCT arm and 39% (n=11) in the allo-SCT arm. CR/CRu followed by progressive disease within 2 months occurred in 3% (n=1) and 4% (n=1) of patients, respectively.
Partial responses were seen in 17% (n=5) of patients in the auto-SCT arm and 7% (n=2) in the allo-SCT arm. There was no change in 7% (n=2) and 0% of patients, respectively. And responses were unknown in 7% (n=2) of patients in the auto-SCT arm.
Progressive disease occurred in 33% (n=10) of patients in the auto-SCT arm and 36% (n=10) in the allo-SCT arm. And treatment-related death occurred in 0% (n=0) and 14% (n=4), respectively.
At the interim analysis, there was no significant difference between the treatment arms with regard to event-free survival (P=0.963) or overall survival (P=0.174).
“At that time, the decision was made to stop the study,” Dr Schmitz said.
He explained that a conditional power analysis showed a low probability that the primary endpoint—a 25% improvement in event-free survival with allo-SCT—could still be met. So the data safety monitoring board decided to stop enrollment.
An updated analysis, performed at a median observation time of 26 months, showed there was still no significant difference in overall survival between the treatment arms (P=0.362).
Cause of death
In the intent-to-treat population—30 patients in the auto-SCT arm and 28 in the allo-SCT arm—there were 16 lymphoma-related deaths, 10 in the auto-SCT arm and 6 in the allo-SCT arm.
There were 6 deaths related to study treatment (4 early and 2 late), all in the allo-SCT arm. One patient in the allo-SCT arm died of post-transplant lymphoproliferative disorder, and 1 patient in the same arm died of hemorrhage after salvage. One patient in each arm died as a result of salvage treatment.
Dr Schmitz and his colleagues also looked at the cause of death among patients who received a transplant—19 in the auto-SCT arm and 17 in the allo-SCT arm.
After SCT, there were 7 deaths in each arm. In the auto-SCT arm, there were 6 lymphoma-related deaths and 1 death related to salvage treatment. In the allo-SCT arm, there were 7 cases of non-relapse-related mortality, including 1 patient with post-transplant lymphoproliferative disorder.
“There certainly seems to be a [graft-vs-lymphoma] effect of allo-transplant in T-cell lymphoma that is, unfortunately, in some way, counterbalanced by high transplant-related mortality,” Dr Schmitz said.
He added that results of a final analysis of the 104 patients enrolled on this study should be available in 2017.
*Information in the abstract differs from that presented at the meeting.