
FDA approves motixafortide for stem cell mobilization in myeloma

Article Type
Changed
Mon, 09/11/2023 - 17:51

 

The U.S. Food and Drug Administration has approved motixafortide (Aphexda, BioLineRx) in combination with filgrastim (G-CSF) to mobilize hematopoietic stem cells for collection and subsequent autologous transplantation in patients with multiple myeloma.

The success of autologous stem cell transplantation (ASCT) depends on adequate mobilization of stem cells during the treatment process. Collection of enough stem cells to perform two transplantations is recommended. However, in up to 47% of patients, collecting the target number of hematopoietic stem cells after one apheresis session remains a challenge, BioLineRx explained in a press release announcing the approval.

The goal of combining motixafortide with filgrastim is to mobilize stem cells more reliably than filgrastim can alone, over fewer days of apheresis and with fewer doses of filgrastim.

“We believe [motixafortide] will play a critical role in addressing unmet needs and introduce a new treatment paradigm for” patients with multiple myeloma, CEO Philip Serlin said in the release.

The drug approval was based on the GENESIS trial, which randomized 122 patients to either motixafortide plus filgrastim or placebo plus filgrastim.

BioLineRx said the trial included patients considered representative of the typical multiple myeloma population undergoing ASCT, with a median age of 63 years and with about 70% of patients in both arms of the trial receiving lenalidomide-containing induction therapy.

Motixafortide plus filgrastim enabled 67.5% of patients to achieve the stem cell collection goal of 6 million or more CD34+ cells/kg in one apheresis session, versus 9.5% of patients receiving the placebo plus filgrastim regimen. Additionally, 92.5% of patients in the motixafortide arm reached the stem cell collection goal within two apheresis sessions, versus 21.4% in the placebo arm.

However, “the data are descriptive and were not statistically powered nor prespecified. The information should be cautiously interpreted,” the company said.

Serious adverse reactions occurred in 5.4% of patients in the motixafortide arm, including vomiting, injection-site reaction, hypersensitivity reaction, injection-site cellulitis, hypokalemia, and hypoxia. The most common adverse reactions, occurring in more than 20% of patients, were injection-site reactions (pain, erythema, and pruritus), pruritus, flushing, and back pain.

Labeling for the subcutaneous injection is available online.

A version of this article first appeared on Medscape.com.




Is additional treatment needed, pretransplant, for r/r AML?

Article Type
Changed
Mon, 09/11/2023 - 18:33

Should patients with acute myeloid leukemia (AML) for whom induction therapy fails to induce complete remission proceed to allogeneic hematopoietic stem cell transplant anyway? Or do these patients fare better when they receive an intensive salvage induction regimen to bring them into remission before transplant?

This critically important question was debated at the annual meeting of the Society of Hematologic Oncology, held in Houston and online.

Johannes Schetelig, MD, argued in favor of proceeding to transplant, even without a complete remission.

“In the past, I’ve told many patients with relapsed or refractory AML that we do need to induce a [complete remission] prior to transplantation,” said Dr. Schetelig, from the Clinical Trials Unit at DKMS in Dresden, Germany. “But is it true?”

According to findings from a recent randomized trial, it may not be. The trial, led by Dr. Schetelig, found that patients with AML who received immediate allogeneic transplant without first having achieved a complete response following induction therapy did just as well as those who received intensive salvage induction therapy to establish remission before transplant.

If this finding holds, it “completely upends” how experts have traditionally approached patients with AML, Mikkael A. Sekeres, MD, of the University of Miami said at a conference press briefing last year.

The phase 3 ASAP trial, presented at last year’s American Society of Hematology meeting, included patients with AML who had had a poor response or who had experienced a relapse after first induction therapy. Patients were randomly assigned to a remission-induction strategy prior to allogeneic stem cell transplant (alloHCT) or a disease-control approach of watchful waiting followed by sequential conditioning and alloHCT. The primary endpoint was treatment success, defined as a complete response at day 56 following alloHCT.

In an intention-to-treat analysis, 83.5% of patients in the disease-control group and 81% in the remission-induction group achieved treatment success. Similarly, in the per-protocol analysis, 84.1% and 81.3%, respectively, achieved a complete response at day 56 after alloHCT. After a median follow-up of 4 years, there were no differences in leukemia-free survival or overall survival between the two groups.

Another advantage to forgoing an intensive salvage induction regimen: Patients in the disease-control arm experienced significantly fewer severe adverse events (23% vs. 64% in the remission induction arm) and spent a mean of 27 fewer days in the hospital prior to transplantation.

At last year’s press briefing, Dr. Schetelig said his team did not expect that a complete response on day 56 after transplantation would translate into “equal long-term benefit” for these groups. “This is what I was really astonished about,” he said.

Delving further into the findings, Dr. Schetelig explained that in the remission-induction arm patients who had had a complete response prior to transplantation demonstrated significantly better overall survival at 4 years than those who had not had a complete response at that point: 60% vs. 40%.

The study also revealed that in the disease-control arm, for patients under watchful waiting who did not need low-dose cytarabine and mitoxantrone for disease control, overall survival outcomes were similar to those of patients in the remission-induction arm who achieved a complete response.

These findings suggest that patients who can be bridged with watchful waiting may have a more favorable disease biology, and chemosensitivity could just be a biomarker for disease biology. In other words, “AML biology matters for transplant outcome and not tumor load,” Dr. Schetelig explained.

A recent study, which found that minimal residual disease (MRD) prior to transplant "had no independent effect on leukemia-free survival," supports this idea, he added.

Overall, Dr. Schetelig concluded that data from the ASAP trial suggest that watchful waiting prior to alloHCT represents “an alternative” for some patients.

Counterpoint: Aim for complete remission

Ronald B. Walter, MD, PhD, argued the counterpoint: that residual disease before transplantation is associated with worse posttransplant outcomes and represents a meaningful pretransplant therapeutic target.

The goal of intensifying treatment for patients with residual disease is to erase disease vestiges prior to transplantation.

“The idea is that by doing so you might optimize the benefit-to-risk ratio and ultimately improve outcomes,” said Dr. Walter, of the translational science and therapeutics division at the Fred Hutchinson Cancer Research Center in Seattle.

Several reports support this view that patients who are MRD negative at the time of transplant have significantly better survival outcomes than patients with residual disease who undergo transplant.

A 2016 study from Dr. Walter and colleagues at Fred Hutchinson, for instance, found that 3-year overall survival was significantly higher among patients with no MRD who underwent myeloablative alloHCT: 73% vs. 26% of those in MRD-positive morphologic remission and 23% of patients with active AML.

Another study, published the year before by a different research team, also revealed that “adult patients with AML in morphologic [complete remission] but with detectable MRD who undergo alloHCT have poor outcomes, which approximates those who undergo transplantation with active disease,” the authors of the 2015 study wrote in a commentary highlighting findings from both studies.

Still, providing intensive therapy prior to transplant comes with drawbacks, Dr. Walter noted. These downsides include potential toxicity from more intense therapy, which may prevent further therapy with curative intent, as well as the possibility that deintensifying therapy could lead to difficult-to-treat relapse.

It may, however, be possible to reduce the intensity of therapy before transplant and still achieve good outcomes after transplant, though the data remain mixed.

One trial found that a reduced-intensity conditioning regimen was associated with a greater risk of relapse post transplant and worse overall survival, compared with standard myeloablative conditioning.

However, another recent trial in which patients with AML or high-risk myelodysplastic syndrome were randomly assigned to either a reduced-intensity conditioning regimen or an intensified version of that regimen prior to transplant demonstrated no difference in relapse rates and overall survival, regardless of patients’ MRD status prior to transplant.

“To me, it’s still key to go into transplant with as little disease as possible,” Dr. Walter said. How much value there is in targeted treatment to further reduce disease burden prior to transplant “will really require further careful study,” he said.

The ASAP trial was sponsored by DKMS. Dr. Schetelig has received honoraria from BeiGene, BMS, Janssen, AstraZeneca, AbbVie, and DKMS. Dr. Walter reported no relevant financial relationships.

A version of this article appeared on Medscape.com.



Article Source

FROM SOHO 2023


Post-SCT, better survival in children with healthy gut diversity

Article Type
Changed
Tue, 09/05/2023 - 20:08

Pediatric patients receiving donor stem cell transplantation who have healthier pretransplant gut microbiota diversity show improved survival and a lower risk of developing acute graft-versus-host disease (GvHD), similar to the patterns reported in adults.

“To the best of our knowledge, we present the first evidence of an association between pretransplantation lower gut microbiota diversity and poorer outcome in children undergoing allo-HSCT,” the authors report, in research published in the journal Blood. “Our findings underscore the importance of pre-transplant gut microbiota diversity and compositional structure in influencing allo-HSCT-related clinical outcomes in the pediatric setting.”

While allogeneic hematopoietic stem cell transplantation (allo-HSCT) can be potentially curative of hematologic malignancies, the stem cell transplantation process can wreak havoc on gut microbiota, because of factors including the conditioning regimen, antibiotic exposure, and dietary changes.

Specifically, the process can substantially decrease alpha diversity and allow potentially pathogenic bacteria to expand.

While poor gut microbiota diversity has been linked to higher mortality in adult patients receiving allo-HSCT, research on the effects in pediatric patients is lacking.

“The gut microbiota of children differs from adults’ one, and this accounts for the need for specific pediatric studies on the gut microbiota-to–allo-HSCT relationship,” the authors write.

For the multicenter study, first author Riccardo Masetti, MD, PhD, of the department of pediatric oncology and hematology at the University of Bologna, Italy, and colleagues analyzed the gut microbiota diversity of 90 pediatric allo-HSCT recipients at four centers in Italy and one in Poland, stratifying the patients into groups of higher and lower diversity pretransplantation and again at the time of neutrophil engraftment.
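In studies like this one, "gut microbiota diversity" is typically summarized per sample with an alpha-diversity index such as the Shannon index, and the cohort is then split into higher- and lower-diversity groups around a cutoff. A minimal sketch of that idea, using made-up taxon counts (the patient names and counts below are illustrative, not data from this study):

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i), over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical pre-transplant taxon counts for two patients
samples = {
    "patient_A": [40, 30, 20, 10],  # reads spread across taxa -> higher diversity
    "patient_B": [97, 1, 1, 1],     # one dominant taxon -> lower diversity
}
indices = {patient: shannon_diversity(c) for patient, c in samples.items()}

# A cohort could then be stratified around the median index into
# "higher diversity" and "lower diversity" groups, as in the study design.
cutoff = sorted(indices.values())[len(indices) // 2]
higher = [p for p, h in indices.items() if h >= cutoff]
```

The actual pipeline (sequencing method, normalization, choice of index and cutoff) is study-specific; this only illustrates why an even spread of taxa scores higher than a community dominated by a single organism.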

Overall, gut microbiota diversity significantly declined from before allo-HSCT to afterward, at the time of neutrophil engraftment (P < .0001), with lower diversity observed in patients 3 years of age or younger.

With a median follow-up of 52 months, patients with higher diversity prior to transplantation had a significantly higher probability of overall survival than the lower diversity group (hazard ratio, 0.26; P = .011), after adjustment for age, graft source, donor type, intensity of conditioning regimen, center, and type of disease. Estimated overall survival at 52 months after allo-HSCT was 88.9% for the higher diversity group and 62.7% for the lower diversity group.
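As a rough consistency check of these figures (not a reproduction of the study's adjusted Cox model), the proportional-hazards relationship S_higher(t) = S_lower(t)^HR links the reported hazard ratio to the two survival estimates; all numbers below come from the article itself.

```python
# Under proportional hazards, the two survival curves relate as:
#   S_higher(t) = S_lower(t) ** HR
# Plugging in the reported lower-diversity survival (62.7%) and the
# adjusted hazard ratio (0.26) approximately reproduces the reported
# higher-diversity survival (88.9%).
hr = 0.26        # adjusted hazard ratio, higher vs. lower diversity
s_lower = 0.627  # 52-month overall survival, lower-diversity group

s_higher_predicted = s_lower ** hr
print(round(s_higher_predicted, 3))  # ~0.886, close to the reported 88.9%
```

The small gap between 88.6% and 88.9% is expected, since the published estimates come from an adjusted model rather than this simple two-number relationship.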

The cumulative incidence of grade II-IV acute GvHD was significantly lower in the higher diversity group than in the lower diversity group (20.0% vs. 44.4%; P = .017), as was the incidence of grade III-IV acute GvHD (2.2% vs. 20.0%; P = .007).

There were, however, no significant differences between the low and high diversity gut microbiota groups in relapse-free survival (P = .091).

The higher diversity group notably had higher relative abundances of potentially health-related bacterial families, including Ruminococcaceae and Oscillospiraceae, while the lower diversity group showed an overabundance of Enterococcaceae and Enterobacteriaceae.

Of note, the results differ from those observed in adults, among whom gut microbiota diversity before as well as after transplantation has been significantly associated with transplant outcomes, whereas with children, the association was limited to diversity prior to transplant.

In general, children have significantly lower diversity of gut microbiota than adults, with varying functional properties, and microbiota that is more easily modified by environmental factors, with larger changes occurring upon exposure to external stressors, the authors explain.

“Considering these different ecological properties compared to adults, we hypothesize that allo-HSCT–induced dysbiosis in the pediatric setting may imply loss of age-related gut microbiota signatures, including alpha diversity, with high interpatient variability,” they say.

Characteristics associated with higher or lower gut microbiota diversity prior to allo-HSCT included the treating center, suggesting that geographical region may affect both the diversity and the type of antibiotic exposure before transplant.

Limitations included that “we didn’t assess other pretransplant characteristics such as the type of chemotherapy received, or the lifestyle, and this should be addressed in future studies on larger cohorts,” Dr. Masetti said in an interview.

While lengthy delays in screening of samples are barriers in the use of the gut microbiome as a tool in clinical practice, he noted that clinicians can take key measures to improve the microbiota.

“[Preventive measures] include the avoidance of unnecessary antibiotic treatment, which has a detrimental effect on the microbiota,” he said. “Moreover, some dietary changes may promote microbiota health.”

In addition, key measures can be taken during the allo-HSCT to preserve the microbiota, he added.

“In our center, we use enteral nutrition with a nasogastric tube rather than parenteral nutrition, which helps the microbiota to recover faster,” Dr. Masetti explained. “Moreover, other interventional measures such as fecal microbiota transplantation or the use of probiotics are under testing.”

“In particular, our data emphasize the importance of an overall healthy network, rather than the abundance of specific families or genera, in preventing complications and unfavorable outcomes.”

Commenting on the study, Robert Jenq, MD, an assistant professor in the departments of genomic medicine and stem cell transplantation and cellular therapy at the University of Texas M.D. Anderson Cancer Center, Houston, noted that with the growing evidence of the effects of poor gut microbiota diversity on clinical outcomes, multiple early-phase clinical trials are being conducted to test various strategies to prevent or treat gut injury.

“I’m not aware of any one approach that has shown enough promise to warrant being tested in multicenter studies yet, but it’s still a bit early,” Dr. Jenq said. “In the meantime, discontinuing or de-escalating antibiotics when medically safe, and encouraging patients to eat as much as they’re able to is a reasonable recommendation.”

Dr. Jenq added that, with most of the data on the issue being retrospective, a causative role has not been established, and “the finding of an association between the gut microbiota composition and survival, while interesting and provocative, does not provide evidence that intervening on the gut microbiota will lead to a clinical benefit.”

“I’m hopeful that randomized clinical trials will eventually demonstrate that we can protect or restore the gut microbiota, and this will lead to substantial clinical benefits, but this remains to be seen,” he said.

The authors had no disclosures to report. Dr. Jenq is an advisor for Seres Therapeutics, Prolacta Biosciences, and MaaT Pharma.

Article Source

FROM BLOOD


Pig kidneys show ‘life-sustaining’ function in human

Article Type
Changed
Thu, 08/24/2023 - 19:23

A pair of genetically modified pig kidneys filtered blood and produced urine for 7 days after being transplanted into a brain-dead patient – marking another important step toward opening up a new supply of much-needed organs for those with end-stage kidney disease.

A team of researchers in Alabama removed a brain-dead person’s kidneys and transplanted two kidneys that had been taken from a genetically modified pig. The researchers monitored the patient’s response to the organs and tracked the kidneys’ function over a 7-day period. The findings were published in JAMA Surgery.

During the first 24 hours after transplantation, the pig kidneys made more than 37 liters of urine. “It was really a remarkable thing to see,” lead investigator Jayme Locke, MD, professor of surgery and the Arnold G. Diethelm Endowed Chair in Transplantation Surgery, University of Alabama at Birmingham, said in a press release.

The recipient was given standard maintenance immunosuppression: tacrolimus, mycophenolate mofetil, and prednisone. The target tacrolimus level (8-10 ng/mL) was reached by postoperative day 2 and was maintained through study completion.

At the end of the study, the serum creatinine level was 0.9 mg/dL, and creatinine clearance was 200 mL/min. Creatinine levels are an indicator of kidney function and demonstrate the organ’s ability to filter waste from blood, according to Roger Lord, PhD, senior lecturer (medical sciences) in the School of Behavioural and Health Sciences, Australian Catholic University, who was not involved in the research.
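Measured creatinine clearance from a timed urine collection follows the classic formula CrCl = (urine creatinine × urine flow rate) / serum creatinine. In the sketch below, the serum creatinine (0.9 mg/dL) and the resulting clearance (200 mL/min) are the values reported in the study, while the urine creatinine and flow values are hypothetical, chosen only to make the arithmetic concrete.

```python
def creatinine_clearance(urine_cr_mg_dl: float,
                         urine_flow_ml_min: float,
                         serum_cr_mg_dl: float) -> float:
    """Measured creatinine clearance, in mL/min.

    CrCl = (Ucr * V) / Pcr, with urine and serum creatinine
    in the same concentration units (here mg/dL).
    """
    return urine_cr_mg_dl * urine_flow_ml_min / serum_cr_mg_dl

# Hypothetical urine values (60 mg/dL collected at 3 mL/min) combined
# with the reported serum creatinine of 0.9 mg/dL give the reported
# clearance:
print(creatinine_clearance(60.0, 3.0, 0.9))  # 200.0 mL/min
```

A clearance in this range indicates that the xenografts were filtering creatinine from blood into urine, which is the "life-sustaining function" the investigators describe.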

This is the first demonstration that a standard immunosuppression regimen may be sufficient to support pig kidney xenotransplantation, and the first in which creatinine clearance was achieved.

The finding comes less than 2 years after the same team published results from a similar experiment. In that transplant, the investigators didn’t observe significant creatinine excretion into the urine.

In the team’s previous attempts, kidney function was delayed because the brain-dead recipients had deteriorated physiologically. This time, the subject was stable, and the team was able to observe urine production within 4 minutes of restoration of blood flow to the transplanted pig organs.

“This new work firmly establishes that the xenografts not only make urine but provide life-sustaining kidney function by clearing serum creatinine,” Locke said in an interview. “This is the first time in history this has been shown.”

The investigators are hoping animal-sourced organs could become an alternative for human transplantations, potentially solving the serious shortage of human organs available for patients on transplant waiting lists.

Organ transplantation can treat patients with advanced kidney disease and kidney failure, but there are not enough human organs available to meet the need. More than 92,000 people in the United States are waiting for a kidney, according to the American Kidney Fund.

Organ rejection is a risk with xenotransplants – animal-to-human organ transplants. Investigators in this study used kidneys from pigs with 10 gene modifications. The modifications were intended to decrease the likelihood of the organs being rejected by a human host.

The kidneys were still viable at the end of the 7-day period. In addition, there was no microscopic blood clot formation, another indicator of normal kidney function, according to Dr. Lord, who provided comments to the UK Science Media Centre.

The long-term outcomes of animal-to-human organ transplantation remain unclear. Dr. Lord describes the operation as a “first step” to demonstrate that genetically modified, transplanted pig kidneys can function normally so as to remove creatinine over a 7-day period.

Dr. Locke and colleagues said: “Future research in living human recipients is necessary to determine long-term xenograft kidney function and whether xenografts could serve as a bridge or destination therapy for end-stage kidney disease.

“Because our study represents a single case, generalizability of the findings is limited. This study showcases xenotransplant as a viable potential solution to an organ shortage crisis responsible for thousands of preventable deaths annually,” they concluded.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

A pair of genetically modified pig kidneys filtered blood and produced urine for 7 days after being transplanted into a brain-dead patient – marking another important step toward opening up a new supply of much-needed organs for those with end-stage kidney disease.

A team of researchers in Alabama removed a brain-dead person’s kidneys and transplanted two kidneys that had been taken from a genetically modified pig. The researchers monitored the patient’s response to the organs and tracked the kidneys’ function over a 7-day period. The findings were published in JAMA Surgery.

During the first 24 hours after transplantation, the pig kidneys made more than 37 liters of urine. “It was really a remarkable thing to see,” lead investigator Jayme Locke, MD, professor of surgery and the Arnold G. Diethelm Endowed Chair in Transplantation Surgery, University of Alabama at Birmingham, said in a press release.

The recipient was given standard maintenance immunosuppression - tacrolimus, mycophenolate mofetil, and prednisone. The target tacrolimus level (8-10 ng/dL) was reached by postoperative day 2 and was maintained through study completion.

At the end of the study, the serum creatinine level was 0.9 mg/dL, and creatinine clearance was 200 mL/min. Creatinine levels are an indicator of kidney function and demonstrate the organ’s ability to filter waste from blood, according to Roger Lord, PhD, senior lecturer (medical sciences) in the School of Behavioural and Health Sciences, Australian Catholic University, who was not involved in the research.

This is the first time that it has been demonstrated that a standard immunosuppression regimen may be sufficient to support xenotransplantation with pig kidneys and in which creatinine clearance was achieved.

The finding comes less than 2 years after the same team published results from a similar experiment. In that transplant, the investigators didn’t observe significant creatinine excretion into the urine.

In the team’s previous attempts, kidney function was delayed because the brain-dead recipients had deteriorated physiologically. This time, the subject was stable, and the team was able to observe urine production within 4 minutes of restoration of blood flow to the transplanted pig organs.

“This new work firmly establishes that the xenografts not only make urine but provide life-sustaining kidney function by clearing serum creatinine,” Locke said in an interview. “This is the first time in history this has been shown.”

The investigators are hoping animal-sourced organs could become an alternative for human transplantations, potentially solving the serious shortage of human organs available for patients on transplant waiting lists.

Organ transplantation can treat patients with advanced kidney disease and kidney failure, but there are not enough human organs available to meet the need. More than 92,000 people in the United States are waiting for a kidney, according to the American Kidney Fund.

Organ rejection is a risk with xenotransplants – animal-to-human organ transplants. Investigators in this study used kidneys from pigs with 10 gene modifications. The modifications were intended to decrease the likelihood of the organs being rejected by a human host.

The kidneys were still viable at the end of the 7-day period. In addition, there was no microscopic blood clot formation, another indicator of normal kidney function, according to Dr. Lord, who provided comments to the UK Science Media Centre.

The long-term outcomes of animal-to-human organ transplantation remain unclear. Dr. Lord describes the operation as a “first step” to demonstrate that genetically modified, transplanted pig kidneys can function normally so as to remove creatinine over a 7-day period.

Dr. Locke and colleagues said: “Future research in living human recipients is necessary to determine long-term xenograft kidney function and whether xenografts could serve as a bridge or destination therapy for end-stage kidney disease.

“Because our study represents a single case, generalizability of the findings is limited. This study showcases xenotransplant as a viable potential solution to an organ shortage crisis responsible for thousands of preventable deaths annually,” they concluded.

A version of this article first appeared on Medscape.com .

A pair of genetically modified pig kidneys filtered blood and produced urine for 7 days after being transplanted into a brain-dead patient – marking another important step toward opening up a new supply of much-needed organs for those with end-stage kidney disease.

A team of researchers in Alabama removed a brain-dead person’s kidneys and transplanted two kidneys that had been taken from a genetically modified pig. The researchers monitored the patient’s response to the organs and tracked the kidneys’ function over a 7-day period. The findings were published in JAMA Surgery.

During the first 24 hours after transplantation, the pig kidneys made more than 37 liters of urine. “It was really a remarkable thing to see,” lead investigator Jayme Locke, MD, professor of surgery and the Arnold G. Diethelm Endowed Chair in Transplantation Surgery, University of Alabama at Birmingham, said in a press release.

The recipient was given standard maintenance immunosuppression - tacrolimus, mycophenolate mofetil, and prednisone. The target tacrolimus level (8-10 ng/dL) was reached by postoperative day 2 and was maintained through study completion.

At the end of the study, the serum creatinine level was 0.9 mg/dL, and creatinine clearance was 200 mL/min. Creatinine levels are an indicator of kidney function and demonstrate the organ’s ability to filter waste from blood, according to Roger Lord, PhD, senior lecturer (medical sciences) in the School of Behavioural and Health Sciences, Australian Catholic University, who was not involved in the research.

This is the first time that it has been demonstrated that a standard immunosuppression regimen may be sufficient to support xenotransplantation with pig kidneys and in which creatinine clearance was achieved.

The finding comes less than 2 years after the same team published results from a similar experiment. In that transplant, the investigators didn’t observe significant creatinine excretion into the urine.

In the team’s previous attempts, kidney function was delayed because the brain-dead recipients had deteriorated physiologically. This time, the subject was stable, and the team was able to observe urine production within 4 minutes of restoration of blood flow to the transplanted pig organs.

“This new work firmly establishes that the xenografts not only make urine but provide life-sustaining kidney function by clearing serum creatinine,” Locke said in an interview. “This is the first time in history this has been shown.”

The investigators are hoping animal-sourced organs could become an alternative for human transplantations, potentially solving the serious shortage of human organs available for patients on transplant waiting lists.

Organ transplantation can treat patients with advanced kidney disease and kidney failure, but there are not enough human organs available to meet the need. More than 92,000 people in the United States are waiting for a kidney, according to the American Kidney Fund.

Organ rejection is a risk with xenotransplants – animal-to-human organ transplants. Investigators in this study used kidneys from pigs with 10 gene modifications. The modifications were intended to decrease the likelihood of the organs being rejected by a human host.

The kidneys were still viable at the end of the 7-day period. In addition, there was no microscopic blood clot formation, another indicator of normal kidney function, according to Dr. Lord, who provided comments to the UK Science Media Centre.

The long-term outcomes of animal-to-human organ transplantation remain unclear. Dr. Lord described the operation as a “first step” demonstrating that genetically modified, transplanted pig kidneys can function normally and remove creatinine over a 7-day period.

Dr. Locke and colleagues said: “Future research in living human recipients is necessary to determine long-term xenograft kidney function and whether xenografts could serve as a bridge or destination therapy for end-stage kidney disease.

“Because our study represents a single case, generalizability of the findings is limited. This study showcases xenotransplant as a viable potential solution to an organ shortage crisis responsible for thousands of preventable deaths annually,” they concluded.

A version of this article first appeared on Medscape.com.

Article Source

FROM JAMA SURGERY


Better than dialysis? Artificial kidney could be the future

Article Type
Changed
Thu, 08/24/2023 - 19:22

Nearly 90,000 patients in the United States are waiting for a lifesaving kidney transplant, yet only about 25,000 kidney transplants were performed last year. Thousands die each year while they wait. Others are not suitable transplant candidates.

Half a million people are on dialysis, the only alternative to transplantation for those with kidney failure. This greatly impacts their work, relationships, and quality of life.

Researchers from The Kidney Project hope to solve this public health crisis with a futuristic approach: an implantable bioartificial kidney. That hope is slowly approaching reality. Early prototypes have been tested successfully in preclinical research, and clinical trials could lie ahead.

This news organization spoke with two researchers who came up with the idea: nephrologist William Fissell, MD, of Vanderbilt University in Nashville, Tenn., and Shuvo Roy, PhD, a biomedical engineer at the University of California, San Francisco. This interview has been edited for length and clarity.
 

Question: Could you summarize the clinical problem with chronic kidney disease?

Dr. Fissell:
Dialysis treatment, although lifesaving, is incomplete. Healthy kidneys do a variety of things that dialysis cannot provide. Transplant is absolutely the best remedy, but donor organs are vanishingly scarce. Our goal has been to develop a mass-produced, universal donor kidney to render the issue of scarcity – scarcity of time, scarcity of resources, scarcity of money, scarcity of donor organs – irrelevant.

Do you envision your implantable, bioartificial kidney as a bridge to transplantation? Or can it be even more, like a bionic organ, as good as a natural organ and thus better than a transplant?

Dr. Roy:
We see it initially as a bridge to transplantation or as a better option than dialysis for those who will never get a transplant. We’re not trying to create the “Six Million Dollar Man.” The goal is to keep patients off dialysis – to deliver some, but probably not all, of the benefits of a kidney transplant in a mass-produced device that anybody can receive.

Dr. Fissell: The technology is aimed at people in stage 5 renal disease, the final stage, when kidneys are failing, and dialysis is the only option to maintain life. We want to make dialysis a thing of the past, put dialysis machines in museums like the iron lung, which was so vital to keeping people alive several decades ago but is mostly obsolete today.

How did you two come up with this idea? How did you get started working together?

Dr. Roy:
I had just begun my career as a research biomedical engineer when I met Dr. William Fissell, who was then contemplating a career in nephrology. He opened my eyes to the problems faced by patients affected by kidney failure. Through our discussions, we quickly realized that while we could improve dialysis machines, patients needed and deserved something better – a treatment that improves their health while also allowing them to keep a job, travel readily, and consume food and drink without restrictions. Basically, something that works more like a kidney transplant.

How does the artificial kidney differ from dialysis?

Dr. Fissell:
Dialysis is an intermittent stop-and-start treatment. The artificial kidney is continuous, around-the-clock treatment. There are a couple of advantages to that. The first is that you can maintain your body’s fluid balance. In dialysis, you get rid of 2-3 days’ worth of fluid in a couple of hours, and that’s very stressful to the heart and maybe to the brain as well. Second advantage is that patients will be able to eat a normal diet. Some waste products that are byproducts of our nutritional intake are slow to leave the body. So in dialysis, we restrict the diet severely and add medicines to soak up extra phosphorus. With a continuous treatment, you can balance excretion and intake.

The other aspect is that dialysis requires an immense amount of disposables. Hundreds of liters of water per patient per treatment, hundreds of thousands of dialysis cartridges and IV bags every year. The artificial kidney doesn’t need a water supply, disposable sorbent, or cartridges.
 

How does the artificial kidney work?

Dr. Fissell:
Just like a healthy kidney. We have a unit that filters the blood so that red blood cells, white blood cells, platelets, antibodies, albumin – all the good stuff that your body worked hard to synthesize – stays in the blood, but a watery soup of toxins and waste is separated out. In a second unit, called the bioreactor, kidney cells concentrate those wastes and toxins into urine.

Dr. Roy: We used a technology called silicon micro-machining to invent an entirely new membrane that mimics a healthy kidney’s filters. It filters the blood just using the patient’s heart as a pump. No electric motors, no batteries, no wires. This lets us have something that’s completely implanted.

We also developed a cell culture of kidney cells that function in an artificial kidney. Normally, cells in a dish don’t fully adopt the features of a cell in the body. We looked at the literature around 3-D printing of organs. We learned that, in addition to fluid flow, stiff scaffolds, like cell culture dishes, trigger specific signals that keep the cells from functioning. We overcame that by looking at the physical microenvironment of the cells –  not the hormones and proteins, but instead the fundamentals of the laboratory environment. For example, most organs are soft, yet plastic lab dishes are hard. By using tools that replicated the softness and fluid flow of a healthy kidney, remarkably, these cells functioned better than on a plastic dish.
 

Would patients need immunosuppressive or anticoagulation medication?

Dr. Fissell:
They wouldn’t need either. The structure and chemistry of the device prevents blood from clotting. And the membranes in the device are a physical barrier between the host immune system and the donor cells, so the body won’t reject the device.

What is the state of the technology now?

Dr. Fissell:
We have shown the function of the filters and the function of the cells, both separately and together, in preclinical in vivo testing. What we now need to do is construct clinical-grade devices and complete sterility and biocompatibility testing to initiate a human trial. That’s going to take between $12 million and $15 million in device manufacturing.

So it’s more a matter of money than time until the first clinical trials?

Dr. Roy: Yes, exactly. We don’t like to say that a clinical trial will start by such-and-such year. From the very start of the project, we have been resource limited.

A version of this article first appeared on Medscape.com.


Antibody shows promise in preventing GVHD

Article Type
Changed
Tue, 08/08/2023 - 11:50

Early, intriguing research suggests that preventing acute graft-versus-host disease (GVHD) in the gut – a potentially life-threatening complication of allogeneic hematopoietic cell transplantation (allo-HCT) – could be accomplished by administering a single antibody against DLL4, a ligand in the Notch signaling pathway, without compromising the stem cell transplant.

“The major surprise was that none of the anti–DLL4-treated animals developed acute gastrointestinal GVHD for the entire duration of the study. This was a remarkable finding, given that intestinal GVHD is otherwise seen in the vast majority of nonhuman primate transplant recipients that receive either no prophylaxis, or prophylaxis with agents other than anti-DLL4 antibodies,” co–senior author Ivan Maillard, MD, PhD, a professor of medicine and vice chief for research in hematology-oncology at the University of Pennsylvania, Philadelphia, said in an interview.

“The timing was critical,” the authors noted in the study, recently published in Science Translational Medicine. “Intervening before any symptoms of GvHD appear made the long-term protection possible.”

While GVHD may be mild to moderate in chronic forms, acute cases can be serious, if not fatal, and nearly all severe acute GVHD prominently involves the gastrointestinal tract, which can drive activation of pathogenic T cells and potentially lead to tissue damage following allo-HCT.

Systemic corticosteroids are standard first-line treatment for acute GVHD. However, response rates generally range only from 40% to 60%, and there are concerns of side effects. Meanwhile, second-line treatments are of inconsistent benefit.

With previous studies in mice showing benefits of Notch pathway inhibition, particularly of DLL4, Dr. Maillard and colleagues further investigated the effects in nonhuman primate allo-HCT recipients, using the anti-DLL4 antibody REGN421, which has pharmacokinetic and toxicity data available from previous studies.

The nonhuman primates were treated with one of two dosing regimens: a single dose of REGN421 3 mg/kg on day 0 post transplant (n = 7) or three weekly doses on days 0, 7, and 14 post transplant (n = 4). These animals were compared with 11 allo-HCT recipients that received supportive care only.

Primates receiving three weekly doses of REGN421 showed antibody concentrations of greater than 2 mcg/mL for more than 30 days post HCT. A single dose of REGN421 was associated with protection from acute GVHD at day 0, while three weekly doses showed protection at days 0, 7, and 14, consistent with an impact of REGN421 during the early phases of T-cell activation.

Compared with animals receiving only supportive care, prophylaxis with REGN421 was associated with delayed acute GVHD onset and lengthened survival.

Of the 11 primates treated with REGN421, none developed clinical signs of gastrointestinal acute GVHD, whereas the majority of those receiving standard care or other preventive interventions did.

“Detailed analysis of acute GVHD clinical presentations in REGN421-treated animals in comparison to no treatment controls revealed near complete protection from GI-acute GvHD with REGN421,” the authors reported.

Furthermore, pathology scores in the gastrointestinal tract were lower with REGN421 treatment, compared with the no-treatment cohort, and the scores matched those of healthy nontransplanted nonhuman primates.

The primates treated with REGN421 did ultimately develop other clinical and pathologic signs of skin, hepatic, or pulmonary acute GVHD, but without gastrointestinal disease.

The treatment was not associated with any adverse effects on the allo-HCT, with primates receiving either a single dose or three weekly doses of REGN421 showing rapid donor engraftment after allo-HCT, including high bone marrow, whole blood, and T-cell donor chimerism.

“Reassuringly, short-term systemic DLL4 blockade with REGN421 did not trigger unexpected side effects in our nonhuman primate model, while preserving rapid engraftment as well as hematopoietic and immune reconstitution.”

The mechanism preserving the engraftment, described as a “major surprise,” specifically involved DLL4 inhibition blocking the homing of pathogenic T cells to the gut while preserving homing of regulatory T cells that dampen the immune response, Dr. Maillard explained.

“This effect turned out to be at least in part through a posttranslational effect of DLL4/Notch blockade on integrin pairing at the T-cell surface,” he explained. “This was a novel and quite unexpected mechanism of action conserved from mice to nonhuman primates.”

The results are encouraging in terms of translating to humans because of their closer similarities in various physiological factors, Dr. Maillard said.

“The nonhuman primate model of transplantation [offers] a transplantation model very close to what is being performed in humans, as well as the opportunity to study an immune system very similar to that of humans in nonhuman primates,” he said.

Dr. Maillard noted that, while trials in humans are not underway yet, “we are in active discussions about it,” and the team is indeed interested in testing REGN421 itself, with the effects likely to be as a prophylactic strategy.

There are currently no approved anti-DLL4 antibody drugs for use in humans.

“Our approach is mostly promising as a preventive treatment, rather than as a secondary treatment for GVHD, because DLL4/Notch blockade seems most active when applied early after transplantation during the time of initial seeding of the gut by T cells (in mice, we had observed the critical time window for a successful intervention to be within 48 hours of transplantation),” Dr. Maillard said. “There remain questions about which other prophylactic treatments we should ideally combine anti-DLL4 antibodies with.”

Dr. Maillard has received research funding from Regeneron and Genentech and is a member of Garuda Therapeutics’s scientific advisory board.


Early, intriguing research suggests that preventing acute graft-versus-host disease (GVHD) in the gut – a potentially life-threatening complication of allogeneic hematopoietic cell transplantation (allo-HCT) – could be accomplished by administering a single anti-DLL4 antibody that blocks Notch signaling, without compromising the stem cell transplant.

“The major surprise was that none of the anti–DLL4-treated animals developed acute gastrointestinal GVHD for the entire duration of the study. This was a remarkable finding, given that intestinal GVHD is otherwise seen in the vast majority of nonhuman primate transplant recipients that receive either no prophylaxis, or prophylaxis with agents other than anti-DLL4 antibodies,” co–senior author Ivan Maillard, MD, PhD, a professor of medicine and vice chief for research in hematology-oncology at the University of Pennsylvania, Philadelphia, said in an interview.

“The timing was critical,” the authors noted in the study, recently published in Science Translational Medicine. “Intervening before any symptoms of GvHD appear made the long-term protection possible.”

While GVHD may be mild to moderate in chronic forms, acute cases can be serious, if not fatal, and nearly all severe acute GVHD prominently involves the gastrointestinal tract, which can drive activation of pathogenic T cells and potentially lead to tissue damage following allo-HCT.

Systemic corticosteroids are the standard first-line treatment for acute GVHD, but response rates generally range from only 40% to 60%, side effects are a concern, and second-line treatments are of inconsistent benefit.

With previous studies on mice showing benefits of targeting Notch pathway inhibition, particularly DLL4, Dr. Maillard and colleagues further investigated the effects in nonhuman primates that were allo-HCT recipients, using the anti-DLL4 antibody REGN421, which has pharmacokinetic and toxicity information available from previous studies.

The nonhuman primates were treated with one of two dosing regimens: a single 3-mg/kg dose of REGN421 at baseline (day 0 post HCT; n = 7) or three weekly doses at days 0, 7, and 14 post transplant (n = 4). These primates were compared with 11 allo-HCT recipients that received supportive care only.

Primates receiving three weekly doses of REGN421 maintained antibody concentrations greater than 2 mcg/mL for more than 30 days post HCT. Dosing at day 0 alone was associated with protection from acute GVHD, as was dosing at days 0, 7, and 14, consistent with an impact of REGN421 during the early phases of T-cell activation.

Compared with animals receiving only supportive care, prophylaxis with REGN421 was associated with delayed acute GVHD onset and lengthened survival.

Of the 11 primates treated with REGN421, none developed clinical signs of gastrointestinal acute GVHD, whereas the majority of those receiving standard care or other preventive interventions did.

“Detailed analysis of acute GVHD clinical presentations in REGN421-treated animals in comparison to no treatment controls revealed near complete protection from GI-acute GvHD with REGN421,” the authors reported.

Furthermore, pathology scores in the gastrointestinal tract were lower with REGN421 treatment, compared with the no-treatment cohort, and the scores matched those of healthy nontransplanted nonhuman primates.

The primates treated with REGN421 did ultimately develop other clinical and pathologic signs of skin, hepatic, or pulmonary acute GVHD, but without gastrointestinal disease.

The treatment was not associated with any adverse effects on the allo-HCT, with primates receiving either a single dose or three weekly doses of REGN421 showing rapid donor engraftment after allo-HCT, including high bone marrow, whole blood, and T-cell donor chimerism.

“Reassuringly, short-term systemic DLL4 blockade with REGN421 did not trigger unexpected side effects in our nonhuman primate model, while preserving rapid engraftment as well as hematopoietic and immune reconstitution,” the authors wrote.

The mechanism preserving the engraftment, described as a “major surprise,” specifically involved DLL4 inhibition blocking the homing of pathogenic T cells to the gut while preserving homing of regulatory T cells that dampen the immune response, Dr. Maillard explained.

“This effect turned out to be at least in part through a posttranslational effect of DLL4/Notch blockade on integrin pairing at the T-cell surface,” he explained. “This was a novel and quite unexpected mechanism of action conserved from mice to nonhuman primates.”

The results are encouraging in terms of translating to humans because of their closer similarities in various physiological factors, Dr. Maillard said.

“The nonhuman primate model of transplantation [offers] a transplantation model very close to what is being performed in humans, as well as the opportunity to study an immune system very similar to that of humans in nonhuman primates,” he said.

Dr. Maillard noted that, while trials in humans are not yet underway, “we are in active discussions about it,” and the team is interested in testing REGN421 itself, most likely as a prophylactic strategy.

There are currently no approved anti-DLL4 antibody drugs for use in humans.

“Our approach is mostly promising as a preventive treatment, rather than as a secondary treatment for GVHD, because DLL4/Notch blockade seems most active when applied early after transplantation during the time of initial seeding of the gut by T cells (in mice, we had observed the critical time window for a successful intervention to be within 48 hours of transplantation),” Dr. Maillard said. “There remain questions about which other prophylactic treatments we should ideally combine anti-DLL4 antibodies with.”

Dr. Maillard has received research funding from Regeneron and Genentech and is a member of Garuda Therapeutics’s scientific advisory board.


FROM SCIENCE TRANSLATIONAL MEDICINE


Neutropenia affects clinical presentation of pulmonary mucormycosis

Article Type
Changed
Tue, 08/08/2023 - 11:53

Neutropenia and radiological findings affected the presentation and diagnosis of pulmonary mucormycosis in adult patients, based on data from 114 individuals.

Diagnosis of pulmonary mucormycosis (PM), an invasive and potentially life-threatening fungal infection, is often delayed because of its variable presentation, wrote Anne Coste, MD, of La Cavale Blanche Hospital and Brest (France) University Hospital, and colleagues.

Improved diagnostic tools including molecular identification and image-guided lung biopsies are now available in many centers, but relations between underlying conditions, clinical presentations, and diagnostic methods have not been described, they said.

In a study published in the journal Chest, the researchers reviewed data from all cases of PM seen at six hospitals in France between 2008 and 2019. PM cases were based on European Organization for Research and Treatment of Cancer and the National Institute of Allergy and Infectious Diseases Mycoses Study Group (EORTC/MSG) criteria. Diabetes and trauma were included as additional host factors, and positive serum or tissue PCR (serum qPCR) was included as mycological evidence. Participants also underwent thoracic computed tomography (CT) scans.

The most common underlying conditions among the 114 patients were hematological malignancy (49%), allogeneic hematopoietic stem-cell transplantation (21%), and solid organ transplantation (17%).

Among the 40% of the cases that involved dissemination, the most common sites were the liver (48%), spleen (48%), brain (44%), and kidneys (37%).

A review of radiology findings showed consolidation in a majority of patients (58%), as well as pleural effusion (52%). Other findings included reversed halo sign (RHS, 26%), halo sign (24%), vascular abnormalities (26%), and cavity (23%).

Bronchoalveolar lavage (BAL) contributed to the diagnosis in 46 of 96 patients (50%), and transthoracic lung biopsy established the diagnosis in 8 of 11 patients (73%) with previously negative BAL.

Seventy patients had neutropenia. Overall, patients with neutropenia were significantly more likely than were those without neutropenia to show an angioinvasive presentation that included both RHS and disease dissemination (P < .05).

In addition, serum qPCR was positive in 42 of 53 patients for whom data were available (79%). Serum qPCR was significantly more likely to be positive in neutropenic patients (91% vs. 62%, P = .02). Positive qPCR was associated with an early diagnosis (P = .03) and treatment onset (P = .01).
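The reported comparison can be reproduced with a two-sided Fisher exact test. The per-group counts below are hypothetical – the article gives only the percentages and the 42-of-53 total – but they are chosen to be consistent with the reported 91% vs. 62% split. A minimal stdlib-only sketch:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def p_table(x):
        # Hypergeometric probability of a table with x in the top-left cell
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Sum the probabilities of all tables as extreme as or more extreme than observed
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts consistent with the reported rates:
# neutropenic 29/32 qPCR positive (~91%), nonneutropenic 13/21 (~62%), 42/53 overall
p = fisher_exact_two_sided(29, 3, 13, 8)
print(f"qPCR+: {29/32:.0%} vs {13/21:.0%}, two-sided P = {p:.3f}")
```

With these assumed counts the test returns P of roughly .02, in line with the reported value, though the exact figure depends on the true group sizes.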

Possible reasons for the high rate of disseminated PM in the current study may be the large number of patients with pulmonary involvement, use of body CT data, and availability of autopsy results (for 11% of cases), the researchers wrote in their discussion.

Neutropenia and radiological findings influence the disease presentation and the contribution of diagnostic tools in PM, the researchers concluded: serum qPCR is more contributive in neutropenic patients, BAL examination in nonneutropenic patients, and lung biopsies are highly contributive when BAL is noncontributive.

The findings were limited by several factors including the retrospective design, the inability to calculate sensitivity and specificity of diagnostic methods, and lack of data on patients with COVID-19, the researchers noted. However, the results provide real-life information for clinicians in centers with current mycological platforms, they concluded.

The study received no outside funding. Dr. Coste had no financial conflicts to disclose.



FROM THE JOURNAL CHEST


Woman with transplanted uterus gives birth to boy

Article Type
Changed
Fri, 07/28/2023 - 09:10

A woman who was born without a uterus has given birth to a boy in Alabama.

It’s the first time that a baby has been born to a woman with a transplanted uterus outside of a clinical trial. Officials from University of Alabama–Birmingham Hospital, where the 2-year process took place, said in a statement on July 24 that the birth sets its uterus transplant program on track to perhaps become covered under insurance plans.

The process of uterus transplant, in vitro fertilization, and pregnancy involves 50 medical providers and is open to women who have uterine factor infertility (UFI). The condition may affect up to 5% of reproductive-age women worldwide. Women with UFI cannot carry a pregnancy to term because they were either born without a uterus, had it removed via hysterectomy, or have a uterus that does not function properly.

The woman, whom the hospital identified as Mallory, moved with her family to the Birmingham area to enter the transplant program, which is one of four programs operating in the United States. Mallory learned when she was 17 years old that she was born without a uterus because of Mayer-Rokitansky-Küster-Hauser syndrome. Her first child, a daughter, was born after her sister carried the pregnancy as a surrogate.

Mallory received her uterus from a deceased donor. Her son was born in May.

“As with other types of organ transplants, the woman must take immunosuppressive medications to prevent the body from rejecting the transplanted uterus,” the transplant program’s website states. “After the baby is born and if the woman does not want more children, the transplanted uterus is removed with a hysterectomy procedure, and the woman no longer needs to take antirejection medications.”

“There are all different ways to grow your family if you have uterine factor infertility, but this [uterus transplantation] is what I feel like I knew that I was supposed to do,” Mallory said in a statement. “I mean, just hearing the cry at first was just, you know, mind blowing.”

A version of this article first appeared on WebMD.com.



Quizartinib boosts AML survival, regardless of SCT

Article Type
Changed
Tue, 07/18/2023 - 16:34

The addition of quizartinib, a potent, selective type II FLT3 inhibitor, to standard induction chemotherapy significantly improves overall survival in patients with newly diagnosed acute myeloid leukemia (AML) with a FLT3 internal tandem duplication (FLT3-ITD), regardless of key factors including receipt of allogeneic hematopoietic cell transplantation (allo-HCT).

The research shows that “FLT3 inhibitors are most effective in patients who are minimal residual disease (MRD) positive before allo-HCT,” first author Richard Schlenk, MD, of Heidelberg (Germany) University Hospital and the German Cancer Research Center, Heidelberg, said in an interview.

The findings are from a post-hoc analysis of the phase 3, multicenter QuANTUM-First trial, which involved patients with the FLT3-ITD mutation, who make up about a quarter of those with AML and who can have shorter survival and increased risk of relapse, compared with patients without the mutation. The current post-hoc analysis of the trial was presented at the European Hematology Association 2023 Congress.

The trial, published in April in The Lancet, showed significant benefits in newly diagnosed patients with FLT3-ITD AML who were treated with quizartinib and standard induction and consolidation therapy and then continued on quizartinib as monotherapy for up to 3 years.

In the trial, quizartinib, combined with standard cytarabine and anthracycline induction and standard cytarabine consolidation chemotherapy, and continued as monotherapy following consolidation, was associated with a significant improvement in overall survival versus placebo (median 31.9 months versus 15.1 months, respectively; hazard ratio, 0.776; P = .0324).
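As a back-of-the-envelope check (not part of the study), one can ask what median survival the hazard ratio alone would predict under a simple exponential, proportional-hazards model, in which medians scale inversely with the hazard:

```python
# Illustrative sketch only: exponential survival S(t) = exp(-lam * t) gives
# median = ln(2)/lam, so under proportional hazards medians scale as 1/HR.
hr = 0.776              # reported hazard ratio, quizartinib vs. placebo
median_placebo = 15.1   # reported median overall survival, months

implied_median = median_placebo / hr
print(f"Implied quizartinib median under an exponential model: {implied_median:.1f} months")
```

The implied median (about 19.5 months) falls well short of the observed 31.9 months, a reminder that a single hazard ratio compresses non-exponential survival curves and that the median gain can exceed what the hazard ratio naively suggests.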

For the post hoc analysis, the authors sought to evaluate whether the benefit held regardless of whether patients received allo-HCT – which may not be recommended when patients achieve remission after the first round of chemotherapy. The issue is important because the efficacy of other targeted FLT3 inhibitors has been associated with allo-HCT.

“Midostaurin, for example is mostly effective if [the drug] is followed by allo-HCT, and much less effective [no significant improvement] without allo-HCT,” Dr. Schlenk said.

The authors also sought to evaluate the relationship between minimal residual disease (MRD) prior to allo-HCT in FLT3-ITD and overall survival.

For the trial, 539 patients with a median age of 56 years were randomized to quizartinib (n = 268) or placebo (n = 271); 147 patients (54.9%) in the quizartinib arm and 150 (55.4%) in the placebo arm achieved complete remission after induction. Rates of complete remission with incomplete hematologic recovery (CRi) were 16.8% and 9.6%, respectively.

Of those achieving complete remission, 57.1% of patients on quizartinib and 48.7% of those receiving placebo underwent allo-HCT in the first complete remission. The median time to allo-HCT in the two groups was 3.5 months with quizartinib and 3.3 months for placebo.

Following the completion of allo-HCT, 61 patients (72.6%) receiving quizartinib and 36 (49.3%) receiving placebo started 3 years of continuation monotherapy.

In addition, 115 patients received allo-HCT outside of CR1, including 60 on quizartinib and 55 on placebo.

After adjustment for factors including region, age, and white blood count, patients treated with quizartinib treatment had a significantly higher overall survival (HR, 0.770; P = .0284), as did those receiving allo-HCT in CR1 (HR, 0.424; P < .0001).

Furthermore, patients receiving quizartinib had a longer overall survival regardless of whether they received allo-HCT in CR1 or not.

Of note, quizartinib-treated patients who were MRD positive prior to allo-HCT had a longer overall survival versus placebo (HR, 0.471), as did, to a lesser degree, those who were MRD negative (HR, 0.717).

There were no new safety signals identified among patients undergoing allo-HCT.

Of note, cytomegalovirus infection was more common in the quizartinib group (11.8%) versus placebo (5.5%), while decreased appetite was less common with quizartinib (2.9%) versus placebo (12.1%).

Asked by an audience member about any risk of graft-versus-host disease (GVHD), Dr. Schlenk noted that “no difference between the quizartinib and placebo arms has been observed in GVHD acute and chronic.”

He added that patients “appear to benefit more from quizartinib if they have higher allelic frequency versus lower, overall,” and that younger patients, in general, showed greater benefit from quizartinib versus those over 60.

In general, “we see that for patients receiving allo-HCT transplantation, it’s beneficial to be randomized in the quizartinib arm [while] patients who did not undergo allo-HCT in first complete remission benefit equally when treated with quizartinib versus placebo,” he said in presenting the findings at the EHA meeting.

“And irrespective of pre–allo-HCT MRD status, longer survival was observed in those treated with quizartinib versus placebo, but most benefit was observed in those who were MRD positive.”

Quizartinib was approved in Japan this year in combination with chemotherapy for patients with newly diagnosed AML whose tumors harbor FLT3-ITD mutations.

The drug was granted a priority review by the U.S. Food and Drug Administration in October 2022. While the target action date was in April, a new decision date of July 21, 2023, is expected.

The study was sponsored by Daiichi Sankyo. Dr. Schlenk reported relationships with Daiichi Sankyo and other companies.



FROM THE EHA 2023 CONGRESS


Survival similar with hearts donated after circulatory or brain death

Article Type
Changed
Thu, 06/22/2023 - 14:40

Heart transplantation using the new strategy of donation after circulatory death (DCD) resulted in similar 6-month survival among recipients as the traditional method of using hearts donated after brain death (DBD) in the first randomized trial comparing the two approaches.

“This randomized trial showing recipient survival with DCD to be similar to DBD should lead to DCD becoming the standard of care alongside DBD,” lead author Jacob Schroder, MD, surgical director, heart transplantation program, Duke University Medical Center, Durham, N.C., said in an interview.

“This should enable many more heart transplants to take place and for us to be able to cast the net further and wider for donors,” he said.

The trial was published online in the New England Journal of Medicine.

Dr. Schroder estimated that only around one-fifth of the 120 U.S. heart transplant centers currently carry out DCD transplants, but he is hopeful that the publication of this study will encourage more transplant centers to do these DCD procedures.

“The problem is there are many low-volume heart transplant centers, which may not be keen to do DCD transplants as they are a bit more complicated and expensive than DBD heart transplants,” he said. “But we need to look at the big picture of how many lives can be saved by increasing the number of heart transplant procedures and the money saved by getting more patients off the waiting list.”

The authors explain that heart transplantation has traditionally been limited to the use of hearts obtained from donors after brain death, which allows in situ assessment of cardiac function and of the suitability for transplantation of the donor allograft before surgical procurement.

But because the need for heart transplants far exceeds the availability of suitable donors, the use of DCD hearts has been investigated, and this approach is now being pursued in many countries. In the DCD approach, the heart has stopped beating in the donor, and perfusion techniques are used to restart the organ.

There are two different approaches to restarting the heart in DCD. The first approach involves the heart being removed from the donor and reanimated, preserved, assessed, and transported with the use of a portable extracorporeal perfusion and preservation system (Organ Care System, TransMedics). The second involves restarting the heart in the donor’s body for evaluation before removal and transportation under the traditional cold storage method used for donations after brain death.

The current trial was designed to compare clinical outcomes in patients who had received a heart from a circulatory death donor using the portable extracorporeal perfusion method for DCD transplantation, with outcomes from the traditional method of heart transplantation using organs donated after brain death.

For the randomized, noninferiority trial, adult candidates for heart transplantation were assigned either to receive a heart from a circulatory-death donor or from a brain-death donor, whichever became available first (circulatory-death group), or to receive only a heart that had been preserved with traditional cold storage after the brain death of the donor (brain-death group).

The primary end point was the risk-adjusted survival at 6 months in the as-treated circulatory-death group, as compared with the brain-death group. The primary safety end point was serious adverse events associated with the heart graft at 30 days after transplantation.

A total of 180 patients underwent transplantation, 90 of whom received a heart donated after circulatory death and 90 who received a heart donated after brain death. A total of 166 transplant recipients were included in the as-treated primary analysis (80 who received a heart from a circulatory-death donor and 86 who received a heart from a brain-death donor).

The risk-adjusted 6-month survival in the as-treated population was 94% among recipients of a heart from a circulatory-death donor, as compared with 90% among recipients of a heart from a brain-death donor (P < .001 for noninferiority).

There were no substantial between-group differences in the mean per-patient number of serious adverse events associated with the heart graft at 30 days after transplantation.

Of 101 hearts from circulatory-death donors that were preserved with the use of the perfusion system, 90 were successfully transplanted according to the criteria for lactate trend and overall contractility of the donor heart, an overall utilization rate of 89%.

More patients who received a heart from a circulatory-death donor had moderate or severe primary graft dysfunction (22%) than those who received a heart from a brain-death donor (10%). However, graft failure that resulted in retransplantation occurred in two (2.3%) patients who received a heart from a brain-death donor versus zero patients who received a heart from a circulatory-death donor.

The researchers note that the higher incidence of primary graft dysfunction in the circulatory-death group is expected, given the period of warm ischemia that occurs in this approach. But they point out that this did not affect patient or graft survival at 30 days or 1 year.

“Primary graft dysfunction is when the heart doesn’t fully work immediately after transplant and some mechanical support is needed,” Dr. Schroder commented to this news organization. “This occurred more often in the DCD group, but this mechanical support is only temporary, and generally only needed for a day or two.

“It looks like it might take the heart a little longer to start fully functioning after DCD, but our results show this doesn’t seem to affect recipient survival.”

He added: “We’ve started to become more comfortable with DCD. Sometimes it may take a little longer to get the heart working properly on its own, but the rate of mechanical support is now much lower than when we first started doing these procedures. And cardiac MRI on recipients before discharge has shown that the DCD hearts are not more damaged than those from DBD donors.”

The authors also report that there were six donor hearts in the DCD group with protocol deviations (functional warm ischemic time greater than 30 minutes or continuously rising lactate levels), and these hearts did not show primary graft dysfunction.

On this observation, Dr. Schroder said: “I think we need to do more work on understanding the ischemic time limits. The current 30-minute time limit was estimated in animal studies. We need to look more closely at data from actual DCD transplants. While 30 minutes may be too long for a heart from an older donor, the heart from a younger donor may be fine for a longer period of ischemic time as it will be healthier.”

“Exciting” results

In an editorial, Nancy K. Sweitzer, MD, PhD, vice chair of clinical research, department of medicine, and director of clinical research, division of cardiology, Washington University in St. Louis, describes the results of the current study as “exciting,” adding that, “They clearly show the feasibility and safety of transplantation of hearts from circulatory-death donors.”

However, Dr. Sweitzer points out that the sickest patients in the study – those who were United Network for Organ Sharing (UNOS) status 1 and 2 – were more likely to receive a DBD heart and the more stable patients (UNOS 3-6) were more likely to receive a DCD heart.

“This imbalance undoubtedly contributed to the success of the trial in meeting its noninferiority end point. Whether transplantation of hearts from circulatory-death donors is truly safe in our sickest patients with heart failure is not clear,” she says.

However, she concludes, “Although caution and continuous evaluation of data are warranted, the increased use of hearts from circulatory-death donors appears to be safe in the hands of experienced transplantation teams and will launch an exciting phase of learning and improvement.”

“A safely expanded pool of heart donors has the potential to increase fairness and equity in heart transplantation, allowing more persons with heart failure to have access to this lifesaving therapy,” she adds. “Organ donors and transplantation teams will save increasing numbers of lives with this most precious gift.”

The current study was supported by TransMedics. Dr. Schroder reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.



Heart transplantation using the new strategy of donation after circulatory death (DCD) resulted in similar 6-month survival among recipients as the traditional method of using hearts donated after brain death (DBD) in the first randomized trial comparing the two approaches.

“This randomized trial showing recipient survival with DCD to be similar to DBD should lead to DCD becoming the standard of care alongside DBD,” lead author Jacob Schroder, MD, surgical director, heart transplantation program, Duke University Medical Center, Durham, N.C., said in an interview.

“This should enable many more heart transplants to take place and for us to be able to cast the net further and wider for donors,” he said.

The trial was published online in the New England Journal of Medicine.

Dr. Schroder estimated that only around one-fifth of the 120 U.S. heart transplant centers currently carry out DCD transplants, but he is hopeful that the publication of this study will encourage more transplant centers to do these DCD procedures.

“The problem is there are many low-volume heart transplant centers, which may not be keen to do DCD transplants as they are a bit more complicated and expensive than DBD heart transplants,” he said. “But we need to look at the big picture of how many lives can be saved by increasing the number of heart transplant procedures and the money saved by getting more patients off the waiting list.”

The authors explain that heart transplantation has traditionally been limited to the use of hearts obtained from donors after brain death, which allows in situ assessment of cardiac function and of the suitability for transplantation of the donor allograft before surgical procurement.

But because the need for heart transplants far exceeds the availability of suitable donors, the use of DCD hearts has been investigated and this approach is now being pursued in many countries. In the DCD approach, the heart will have stopped beating in the donor, and perfusion techniques are used to restart the organ.

There are two different approaches to restarting the heart in DCD. The first approach involves the heart being removed from the donor and reanimated, preserved, assessed, and transported with the use of a portable extracorporeal perfusion and preservation system (Organ Care System, TransMedics). The second involves restarting the heart in the donor’s body for evaluation before removal and transportation under the traditional cold storage method used for donations after brain death.

The current trial was designed to compare clinical outcomes in patients who had received a heart from a circulatory death donor using the portable extracorporeal perfusion method for DCD transplantation, with outcomes from the traditional method of heart transplantation using organs donated after brain death.

For the randomized, noninferiority trial, adult candidates for heart transplantation were assigned to receive a heart after the circulatory death of the donor or a heart from a donor after brain death if that heart was available first (circulatory-death group) or to receive only a heart that had been preserved with the use of traditional cold storage after the brain death of the donor (brain-death group).

The primary end point was the risk-adjusted survival at 6 months in the as-treated circulatory-death group, as compared with the brain-death group. The primary safety end point was serious adverse events associated with the heart graft at 30 days after transplantation.

A total of 180 patients underwent transplantation, 90 of whom received a heart donated after circulatory death and 90 who received a heart donated after brain death. A total of 166 transplant recipients were included in the as-treated primary analysis (80 who received a heart from a circulatory-death donor and 86 who received a heart from a brain-death donor).

The risk-adjusted 6-month survival in the as-treated population was 94% among recipients of a heart from a circulatory-death donor, as compared with 90% among recipients of a heart from a brain-death donor (P < .001 for noninferiority).

There were no substantial between-group differences in the mean per-patient number of serious adverse events associated with the heart graft at 30 days after transplantation.

Of 101 hearts from circulatory-death donors that were preserved with the use of the perfusion system, 90 were successfully transplanted according to the criteria for lactate trend and overall contractility of the donor heart, resulting in an overall utilization rate of 89%.

More patients who received a heart from a circulatory-death donor had moderate or severe primary graft dysfunction (22%) than those who received a heart from a brain-death donor (10%). However, graft failure that resulted in retransplantation occurred in two (2.3%) patients who received a heart from a brain-death donor versus zero patients who received a heart from a circulatory-death donor.

The researchers note that the higher incidence of primary graft dysfunction in the circulatory-death group is expected, given the period of warm ischemia that occurs in this approach. But they point out that this did not affect patient or graft survival at 30 days or 1 year.

“Primary graft dysfunction is when the heart doesn’t fully work immediately after transplant and some mechanical support is needed,” Dr. Schroder commented to this news organization. “This occurred more often in the DCD group, but this mechanical support is only temporary, and generally only needed for a day or two.

“It looks like it might take the heart a little longer to start fully functioning after DCD, but our results show this doesn’t seem to affect recipient survival.”

He added: “We’ve started to become more comfortable with DCD. Sometimes it may take a little longer to get the heart working properly on its own, but the rate of mechanical support is now much lower than when we first started doing these procedures. And cardiac MRI on the recipient patients before discharge has shown that the DCD hearts are not more damaged than those from DBD donors.”

The authors also report that six donor hearts in the DCD group had protocol deviations, with a functional warm ischemic time greater than 30 minutes or continuously rising lactate levels; these hearts did not show primary graft dysfunction.

On this observation, Dr. Schroder said: “I think we need to do more work on understanding the ischemic time limits. The current 30 minutes time limit was estimated in animal studies. We need to look more closely at data from actual DCD transplants. While 30 minutes may be too long for a heart from an older donor, the heart from a younger donor may be fine for a longer period of ischemic time as it will be healthier.”
“Exciting” results

In an editorial, Nancy K. Sweitzer, MD, PhD, vice chair of clinical research, department of medicine, and director of clinical research, division of cardiology, Washington University in St. Louis, describes the results of the current study as “exciting,” adding that, “They clearly show the feasibility and safety of transplantation of hearts from circulatory-death donors.”

However, Dr. Sweitzer points out that the sickest patients in the study – those who were United Network for Organ Sharing (UNOS) status 1 and 2 – were more likely to receive a DBD heart and the more stable patients (UNOS 3-6) were more likely to receive a DCD heart.

“This imbalance undoubtedly contributed to the success of the trial in meeting its noninferiority end point. Whether transplantation of hearts from circulatory-death donors is truly safe in our sickest patients with heart failure is not clear,” she says.

However, she concludes, “Although caution and continuous evaluation of data are warranted, the increased use of hearts from circulatory-death donors appears to be safe in the hands of experienced transplantation teams and will launch an exciting phase of learning and improvement.”

“A safely expanded pool of heart donors has the potential to increase fairness and equity in heart transplantation, allowing more persons with heart failure to have access to this lifesaving therapy,” she adds. “Organ donors and transplantation teams will save increasing numbers of lives with this most precious gift.”

The current study was supported by TransMedics. Dr. Schroder reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM THE NEW ENGLAND JOURNAL OF MEDICINE
