The Official Newspaper of the American Association for Thoracic Surgery
August 15 Deadline: Cardiovascular Thoracic Critical Care Conference (DC) Scholarship
Applications are now open for the 2015 AATS Critical Care Scholarship, which provides CT surgery residents with the opportunity to learn about critical care by underwriting their attendance at the Cardiovascular Thoracic (CVT) Critical Care Conference. The event will take place October 1-3, 2015 at the Omni Shoreham Hotel in Washington, D.C.
The AATS Graham Foundation funds the scholarship.
Eligibility
Applicants must be enrolled in either:
** An ACGME-accredited cardiothoracic surgical training program in the United States.
** An RCPSC-accredited cardiothoracic surgical residency program in Canada.
Benefits
** A stipend of $500 to the applicant’s institution to help offset the costs of travel and hotel accommodations.
** Free course registration.
Application Process
** Institutions are invited to nominate one resident to participate in this scholarship.
** Additional residents may be nominated by a Program Director. They will be put on a waiting list until the application process has been completed. If vacancies become available, those on the waiting list will be accepted on a first-come, first-served basis.
Up to 30 scholarships will be awarded on a first-come, first-served basis!
Deadline: August 15, 2015
Lung cancer biomarker moves into the clinic
SEATTLE – A new biomarker for bronchial epithelium that helps identify smokers with suspicious lesions who have lung cancer is now ready for clinical use. And one for nasal epithelium that could be used for screening may not be far behind.
“There is clearly a critical unmet need to develop molecular biomarkers to address some of the challenges that we now face since we have instituted CT screening for lung cancer,” Dr. Avi Spira said at a joint meeting of the Global Biomarkers Consortium and World Cutaneous Malignancies Congress.
Although the National Lung Screening Trial established that annual chest CT among high-risk current and former smokers reduces their risk of death from lung cancer (N Engl J Med. 2011;365:395-409), the vast majority of those who screen positive do not have lung cancer. Also, screening only patients who meet criteria set by the trial will pick up less than half of all lung cancers in the United States.
“That leads to two critical unmet needs for molecular biomarkers in the so-called post–National Lung Screening Trial era,” said Dr. Spira, professor of medicine, pathology and laboratory medicine, and bioinformatics; chief of the division of computational biomedicine; and director of the translational bioinformatics program, Clinical and Translational Science Institute, all at Boston University.
“The first is … we desperately need molecular biomarkers that can distinguish a benign nodule found on CT versus a malignant one,” he said. “The second and arguably longer-term biomarker that we need is to distinguish which smokers would benefit from CT screening annually.”
Much of his team’s research in this area builds on the concept of field of injury. “The idea here is if you smoke, even though lung cancer tends to develop deep within the parenchyma of your lung, all of the epithelial cells that line your respiratory tract have genomic alterations that reflect the presence of that cancer,” Dr. Spira explained. Thus, profiling epithelial cells anywhere in the airway could be used for early detection and risk assessment.
He and his colleagues developed a 23-gene signature for use on bronchial epithelial cells. The biomarker was validated in the Airway Epithelium Gene Expression In the Diagnosis of Lung Cancer (AEGIS) 1 and 2 trials among 639 current and former smokers undergoing bronchoscopy for suspicious nodules seen on CT.
With 1 year of follow-up, biomarker sensitivity was 88%-89%, while specificity was 47% (N Engl J Med. 2015;373:243-251). “However the negative predictive value, which is really what drives the clinical utility of this test, is above 90%. And that’s what we believe will drive physicians to use the test – [determining] who can they avoid sending for an unnecessary [biopsy] procedure,” Dr. Spira said. Bronchoscopy alone had sensitivity of about 75%, but bronchoscopy combined with the gene signature had sensitivity of 97%.
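The interplay of sensitivity, specificity, and prevalence that Dr. Spira alludes to can be sketched with the standard formula for negative predictive value. The sensitivity (88%) and specificity (47%) below are the figures reported in the article; the prevalence values are illustrative assumptions only, not data from the AEGIS trials:

```python
# Negative predictive value (NPV) from sensitivity, specificity, and prevalence.
# Sensitivity/specificity are the article's figures; prevalence values are
# hypothetical, chosen to show how NPV varies with pretest disease frequency.

def npv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Probability that a patient with a negative test is truly disease-free."""
    true_negatives = specificity * (1 - prevalence)
    false_negatives = (1 - sensitivity) * prevalence
    return true_negatives / (true_negatives + false_negatives)

for prevalence in (0.10, 0.25, 0.30):
    print(f"prevalence {prevalence:.0%}: NPV = {npv(0.88, 0.47, prevalence):.1%}")
```

Under these assumptions, the modest 47% specificity still yields an NPV above 90% whenever cancer prevalence in the tested population stays below roughly 30%, which is consistent with the test's intended use in ruling out disease.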
Subgroup analyses showed the biomarker had superior sensitivity for detecting lung cancer when lesions measured no more than 3 cm or were located in the lung periphery, and when patients had early-stage disease. In addition, it performed similarly well across different types of tumors.
Of special note, among patients whose pretest probability of cancer fell in the intermediate range (10%-60%), bronchoscopy had an 83% nondiagnostic rate, but the biomarker had 88% sensitivity and a 91% negative predictive value. “That means if you have a nondiagnostic bronchoscopy in a patient who is at intermediate pretest risk for disease, a negative gene expression test would mean there is a less than 10% chance this is cancer. That’s where a physician might feel, okay, I don’t have to go on and do a biopsy, I can watch that patient serially with CT scans of the chest,” Dr. Spira said.
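The "less than 10% chance" reasoning above is a Bayes calculation: a negative test revises the pretest probability downward. The 88% sensitivity is quoted in the article; the specificity used here is the overall 47% figure, an assumption for illustration, since the subgroup's specificity is not reported (the quoted 91% NPV in this subgroup implies somewhat better performance at the upper end of the pretest range than this sketch shows):

```python
# Posttest probability of cancer after a NEGATIVE gene-expression test,
# via Bayes' rule. Sensitivity is the article's 88%; the 47% specificity
# is the overall figure, assumed here for illustration.

def posttest_prob_after_negative(pretest: float, sensitivity: float,
                                 specificity: float) -> float:
    """P(cancer | negative test) given a pretest probability of cancer."""
    missed_cancers = (1 - sensitivity) * pretest      # false negatives
    correct_negatives = specificity * (1 - pretest)   # true negatives
    return missed_cancers / (missed_cancers + correct_negatives)

for pretest in (0.10, 0.30, 0.60):
    post = posttest_prob_after_negative(pretest, 0.88, 0.47)
    print(f"pretest {pretest:.0%} -> posttest {post:.1%} after a negative test")
```

At a 30% pretest probability, for example, a negative result drops the probability of cancer to just under 10% under these assumptions, matching the threshold at which Dr. Spira suggests a physician might choose CT surveillance over biopsy.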
The biomarker test is now clinically available (Percepta, manufactured by Veracyte). “I think it’s exciting because it’s the first of what I believe are many molecular biomarkers that are going to be emerging in the clinical space for the early detection of lung cancer,” he said.
“The multimillion dollar question is why are we seeing gene expression changes in normal-appearing cells so far away from where the tumor arises? We don’t have the full answer to that yet, but based on the genes that are changing, we have developed some hypotheses,” Dr. Spira said.
Some of the down-regulated genes are involved in antioxidant and DNA repair pathways, suggesting that the smokers who ultimately get cancer have less of a protective response to smoking. And some of the up-regulated genes include ones in the PI3 kinase signaling pathway.
“I would argue that what we are seeing in the proximal airway isn’t necessarily reflecting the presence of the cancer but the susceptibility, and that’s a really important distinguishing factor because then perhaps the test could be used as a screening tool,” Dr. Spira maintained.
As not all smokers at elevated risk for lung cancer will undergo bronchoscopy, one of the investigators’ future goals is to move biomarker testing to a less invasive site. They are currently focusing on the nose, using nasal epithelium collected by brushings from the inferior turbinate.
An analysis of nasal epithelium collected at the time of bronchoscopy in the AEGIS trials has shown that a 200-gene signature performs well for distinguishing between patients with and without lung cancer, Dr. Spira reported. Furthermore, the changes in gene expression profile in the nose were similar to those seen in the bronchus.
Such a biomarker might have best clinical utility in two other settings, he proposed. The first would be in patients having nodules that are clearly not accessible by bronchoscopy, in which case the biomarker would be applied for diagnosis. The second would be in smokers being seen for routine annual exams, in which case it would be used to identify those who should have CT surveillance.
“We are hopeful that the nasal epithelium can serve as a less invasive surrogate for the bronchus and ultimately allow us to move airway profiling into the screening setting for lung cancer in the longer term,” he concluded.
Dr. Spira disclosed that he receives intellectual property rights and consulting fees from, and has an ownership interest in, Veracyte Inc.
AT THE GLOBAL BIOMARKERS CONSORTIUM CONFERENCE
‘David technique’ may enhance aortic repair
Many techniques for repair of aortic dissection have evolved, but no trials have compared them to determine which is best. However, a study team has attempted to evaluate a surgical approach (the “David technique”) that includes three specific steps – no aortic cross-clamp, the use of deep hypothermic circulatory arrest (DHCA), and the antegrade resumption of cardiopulmonary bypass. They found that this approach yielded significantly better long-term outcomes than the other approaches tried.
The study investigators, led by Dr. Jennifer S. Lawton of Washington University in St. Louis, reported their findings in the Journal of Thoracic and Cardiovascular Surgery (J. Thorac. Cardiovasc. Surg. 2015 [doi:10.1016/j.jtcvs.2015.03.023]). “We hypothesized that a surgical strategy to prevent cross-clamp injury or false lumen pressurization would be associated with reduced morbidity, mortality, persistent false lumen patency, and improved survival,” Dr. Lawton and her coauthors wrote. “This study was designed to determine the differences in outcomes between operative techniques.”
The study evaluated 196 patients who had surgery for acute type A aortic dissection over 17 years. Group 1, which comprised 49 patients, had the operation according to the protocol that involved the three specific steps, as Dr. Tirone David of the University of Toronto first reported in 1999 (Ann. Thorac. Surg. 1999;67:1999-2001) — the “David technique,” as the study authors called it. Group 2 consisted of patients whose repair involved a variety of techniques, including one or two steps of the David technique but not all three.
Study endpoints were 30-day mortality rate, postoperative adverse events, presence of a false aortic lumen, and overall survival, the latter defined as the time from the date of surgery to the date of death or last follow-up. The evaluation included examination of each patient’s latest CT scan or MRI obtained at least 6 months after the operation to assess for a persistent false lumen, but only 78 patients had imaging at that interval.
Patients in Group 1 had a higher rate of persistent false lumen – 74% vs. 68% in Group 2. Thirty-day mortality was 6.1% in Group 1 and 15.7% in Group 2, but Dr. Lawton and her coauthors said this difference was not statistically significant.
Survival rates at 1, 5, and 10 years in both groups were “consistent with published ranges,” the authors said. At 5 years, the predicted survival was 86% for Group 1 and 56% for Group 2; and at 10 years, 72% and 37%, respectively.
The study authors acknowledged the controversy that surrounds the use of retrograde resumption of cardiopulmonary bypass after replacement of the ascending aorta and that there’s no consensus on which method is best for resuming cardiopulmonary bypass after repair of a type A aortic dissection.
The study also found no difference in the incidence of false lumen between the two groups, but again, this is a source of controversy. “Persistence of a false lumen following repair for type A aortic dissection has been reported to be associated with poor prognosis and reduced long-term survival,” Dr. Lawton and her colleagues said.
“Others have reported a patent false lumen was not an independent predictor of late reoperation, but was a predictor of aortic growth following repair of type A aortic dissection.”
The study authors said one limit of their findings is its retrospective nature, but they also said that a prospective, randomized trial would be difficult to conduct.
None of the study coauthors had any relationships to disclose. They presented their original data at the American Association for Thoracic Surgery Aortic Symposium, April 24-25, 2014, in New York.
Whether or not to use a cross-clamp in type A aortic dissection repair is a critical question, but a major concern of this study was the wide variability of techniques used in the comparison group, Dr. Richard J. Shemin said in his invited commentary (J. Thorac. Cardiovasc. Surg. 2015 [doi:10.1016/j.jtcvs.2015.04.038]). “The variety of approaches attests to the lack of institutional agreement on the surgical principles tested in the study,” he said. “The large variety of techniques in the control group makes the comparison and interpretation of this study difficult.”
“There are more questions to consider from this study than answers derived from the data about the clamp strategy,” he said.
But, Dr. Shemin said, using the cross-clamp with axillary antegrade perfusion is “not a major issue.” And the use of clamping during the cooling period can save overall cardiac arrest time during the operation.
“If one does use femoral cannulation, then not applying the cross-clamp until achieving circulatory arrest is prudent,” he said. “With axillary cannulation, one achieves antegrade perfusion so early cross-clamping can be safely performed with the advantages of saving operative time.”
The clamp site must be inspected during circulatory arrest. Antegrade cerebral perfusion is proven to be an excellent technique and is facilitated by right axillary cannulation, Dr. Shemin said. “Most importantly, establishing antegrade CPB [cardiopulmonary bypass] perfusion after circulatory arrest is mandatory in all cases to minimize distal aorta trauma,” he said.
Dr. Richard J. Shemin is a cardiothoracic surgeon at UCLA Medical Center, Santa Monica, Calif.
Many techniques for repair of aortic dissection have evolved, but no trials have compared those techniques to determine which is the best. However, a study team has attempted to evaluate a surgical approach (the “David technique”) that includes three specific steps – no aortic cross clamp, the use of deep hypothermic circulatory arrest (DHCA), and the antegrade resumption of cardiopulmonary bypass. They found that this approach yielded significantly better long-term outcomes than did other approaches tried.
The study investigators, led by Dr. Jennifer S. Lawton of Washington University in St. Louis, reported their findings in the Journal of Thoracic and Cardiovascular Surgery (J. Thorac. Cardiovasc. Surg. 2015 [doi:10.1016/j.jtcvs.2015.03.023]). “We hypothesized that a surgical strategy to prevent cross-clamp injury or false lumen pressurization would be associated with reduced morbidity, mortality, persistent false lumen patency, and improved survival,” Dr. Lawton and her coauthors wrote. “This study was designed to determine the differences in outcomes between operative techniques.”
The study evaluated 196 patients who had surgery for acute type A aortic dissection over 17 years. Group 1, which comprised 49 patients, had the operation according to the protocol that involved the three specific steps, as Dr. Tirone David of the University of Toronto first reported in 1999 (Ann. Thorac. Surg. 1999;67:1999-2001) — the “David technique,” as the study authors called it. Group 2 consisted of patients whose repair involved a variety of techniques, including one or two steps of the David technique but not all three.
Study endpoints were 30-day mortality rate, postoperative adverse events, presence of a false aortic lumen, and overall survival, the latter defined as the time from the date of surgery to the date of death or last follow-up. To assess false lumen persistence, the evaluation included each patient’s latest CT scan or MRI obtained at least 6 months after the operation, but only 78 patients had imaging at that interval.
Patients in Group 1 had a numerically higher rate of persistent false lumen (74% vs. 68% in Group 2), a difference that was not statistically significant. Thirty-day mortality was 6.1% in Group 1 and 15.7% in Group 2, but Dr. Lawton and her coauthors said this difference also was not statistically significant.
Survival rates at 1, 5, and 10 years in both groups were “consistent with published ranges,” the authors said. At 5 years, predicted survival was 86% for Group 1 and 56% for Group 2; at 10 years, it was 72% and 37%, respectively.
The study authors acknowledged the controversy that surrounds the use of retrograde resumption of cardiopulmonary bypass after replacement of the ascending aorta and that there’s no consensus on which method is best for resuming cardiopulmonary bypass after repair of a type A aortic dissection.
The study also found no difference in the incidence of false lumen between the two groups, but again, this is a source of controversy. “Persistence of a false lumen following repair for type A aortic dissection has been reported to be associated with poor prognosis and reduced long-term survival,” Dr. Lawton and her colleagues said.
“Others have reported a patent false lumen was not an independent predictor of late reoperation, but was a predictor of aortic growth following repair of type A aortic dissection.”
The study authors said one limitation of their findings is the study’s retrospective nature, but they also said that a prospective, randomized trial would be difficult to conduct.
None of the study coauthors had any relationships to disclose. They presented their original data at the American Association for Thoracic Surgery Aortic Symposium, April 24-25, 2014, in New York.
Key clinical point: An operation to repair type A aortic dissection that involves three specific steps achieves better outcomes than do other surgical approaches.
Major finding: Survival rates at 5 years were 86% for the group that had operations in which the surgeons used the three specific steps vs. 56% for the other group.
Data source: Retrospective analysis of a single-center population of 196 patients who had repairs for type A aortic dissection.
Disclosures: None of the study coauthors had any relationships to disclose.
Similar outcomes for salvage vs. planned surgery after chemoradiotherapy in esophageal cancer
For patients with esophageal cancer, salvage surgery after definitive chemoradiotherapy had similar mortality and morbidity rates, and similar survival outcomes, to the combination of neoadjuvant chemoradiation and planned surgery, according to a study published online in the Journal of Clinical Oncology.
Definitive chemoradiotherapy (dCRT) is an alternative to highly invasive surgical resection, which carries a significant rate of morbidity and mortality; however, recent data indicate that 50% of patients with complete response to dCRT experience tumor recurrence.
“Our study demonstrated a similar survival and recurrence pattern for the SALV [salvage surgery after definitive chemoradiotherapy] and NCRS [neoadjuvant chemoradiation and planned surgery] groups, potentially validating an approach of dCRT with reserved SALV for persistent or recurrent disease. Importantly, there were no differences in oncologic safety of surgery, including extent of nodal dissection, between the SALV and NCRS groups,” wrote Dr. Sheraz Markar, a clinical research fellow from Imperial College London, and colleagues (J. Clin. Oncol. 2015 July 20 [doi:10.1200/JCO.2014.59.9092]).
The retrospective study compared 308 patients with esophageal cancer who underwent SALV with 540 patients who received NCRS at European centers from 2000 to 2010. After a median follow-up of 54 months, the SALV and NCRS groups had similar rates of 3-year overall survival (43.3% vs. 40.1%) and disease-free survival (39.2% vs. 32.8%). The two groups also had similar rates of tumor recurrence: overall (46.8% vs. 47.9%), locoregional (18.8% vs. 15.9%), distant (24.3% vs. 28.1%), and mixed (13.0% vs. 13.5%).
The SALV and NCRS groups had similar rates of in-hospital mortality (8.4% vs. 9.3%) and morbidity (63.6% vs. 58.9%), but SALV patients had significantly more complications from anastomotic leak (17.2% vs. 10.7%) and surgical site infection (18.5% vs. 12.2%).
Subset analysis of the SALV group showed that patients who received a total radiation dose ≥ 55 Gy (compared with SALV patients who received a lower dose) had significantly increased in-hospital mortality (27.8% vs. 4.3%; P < .001), overall morbidity (75.9% vs. 61%; P = .039), anastomotic leak (27.8% vs. 15%; P = .023), surgical site infection (29.6% vs. 16.1%; P = .02), and pulmonary complications (55.6% vs. 40.2%; P = .038).
“Currently, there is no evidence in terms of locoregional control or survival benefit to support a high total radiation dose (> 50 Gy) in patients receiving dCRT,” according to the researchers, who noted that the findings suggest “an upper threshold of 50 Gy should be used in these patients to optimize the benefits of dCRT without compromising the safety of SALV, if required.”
Patients who underwent SALV at high-volume centers had significantly lower rates of in-hospital mortality (6.3% vs. 16.2%; P = .009) and overall morbidity (58.8% vs. 80.9%; P = .001) compared with procedures done at low-volume centers.
Compared with recurrent disease, patients with persistent disease after dCRT had poorer long-term prognoses, suggesting a more aggressive tumor biology. Early identification of CRT-resistant tumors to allow early surgical treatment is an important area for future investigation, the investigators said.
FROM JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: As management for esophageal cancer, salvage esophagectomy after definitive chemoradiotherapy (SALV) produced similar outcomes to the combination of neoadjuvant chemoradiation and planned surgery (NCRS).
Major finding: The SALV and NCRS groups had similar rates of 3-year overall survival (43.3% vs. 40.1% ) and disease-free survival (39.2% vs. 32.8%), tumor recurrence (46.8% vs. 47.9%), and in-hospital mortality (8.4% vs. 9.3%) and morbidity (63.6% vs. 58.9%).
Data source: Retrospective analysis of 848 patients (308 SALV, 540 NCRS) who underwent surgical resection for esophageal cancer in French-speaking European centers from 2000 to 2010.
Disclosures: Dr. Markar reported having no disclosures. Two of his coauthors reported ties to industry.
Dickkopf-3 overexpression linked to tumor traits in esophageal adenocarcinoma
Esophageal adenocarcinoma is fatal to 90% of patients, indicating a profound need for new therapeutic agents, according to Zhuwen Wang and her colleagues.
The search for genes involved in oncogenesis provides one avenue for finding potential targets. Ms. Wang and her colleagues performed a laboratory study at the University of Michigan, Ann Arbor, to examine the results of the overexpression of DKK3 (the gene for the Dickkopf-3 protein [DKK3]) in DKK3-transfected tissue-cultured esophageal adenocarcinoma (EAC) cell lines. They found that DKK3 overexpression correlated with significantly increased proliferation and Matrigel invasion ability – both known to be important oncogenic traits (J. Thorac. Cardiovasc. Surg. 2015;150:377-85).
DKK3 was overexpressed (greater than twofold) in 76% (72/95) of esophageal adenocarcinomas tested. In addition, the DKK3 protein was present at moderate to high levels in 47% (29/62) of esophageal adenocarcinomas as shown by tissue microarray. Nodal metastases were also significantly increased in patients with esophageal adenocarcinomas highly overexpressing DKK3 (28/32) vs. non–highly expressing EAC tumors (42/63).
In vitro studies showed that stable transfection of DKK3 in an EAC cell line significantly increased proliferation and Matrigel invasion. The researchers also found that the levels of SMAD4, a key mediator of the transforming growth factor–beta pathway, increased after activin treatment of the transfected cell line, and siSMAD4 significantly decreased Matrigel invasion, suggesting that DKK3 acts through the transforming growth factor–beta pathway, according to the researchers.
Additionally, the transfected cells showed increased endothelial tube formation, and they were significantly more resistant to 5-fluorouracil and cisplatin. This finding correlates with the fact that DKK3 expression was found to be significantly higher in chemoresistant esophageal adenocarcinomas, the investigators reported.
In their animal model system (NOD/SCID gamma mice), injection of the transfected cells resulted in tumors at all sites (8/8), whereas vector-only cells grew in only one of eight sites.
“The results of the current study suggest that DKK3 may play an important role in tumor growth and invasion in EAC. DKK3 is overexpressed in a significant number of esophageal adenocarcinomas and targeting DKK3 and its downstream mediators may be beneficial in the prevention and treatment of micrometastatic disease and potentially decreasing disease recurrence,” the researchers concluded.
The authors reported that they had no relevant financial conflicts.
The number of EAC cell lines currently available for study is limited. Despite the fact that the majority of EAC human tumors overexpress Dickkopf-3 in vivo, none of the cell lines used in this study had significant expression. Therefore the authors had to use a forced overexpression transfection strategy to evaluate EAC tumor biology, according to Dr. David R. Jones (J. Thorac. Cardiovasc. Surg. 2015;150:288).
This is less than optimal because of an inability to silence Dickkopf-3 to determine if it is sufficient or merely required to drive tumorigenesis or aggressiveness. A more glaring and overarching problem is the lack of an adequate animal model, Dr. Jones added. The xenograft flank model may not recapitulate EAC tumorigenesis or metastases. “Currently there are no well-characterized EAC genetically engineered mouse models that faithfully reproduce EAC,” he pointed out. “The lack of EAC animal models is a major barrier to the preclinical evaluation of novel therapies, target validation, and pathway discovery and confirmation.”
Dr. Jones is a cardiothoracic surgeon at Memorial Sloan Kettering Cancer Center, New York. He made these remarks in his invited commentary.
FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY
Key clinical point: DKK3 may be important in mediating invasion in esophageal adenocarcinoma and could be a novel target for treating and preventing metastasis.
Major finding: Dickkopf-3 overexpression correlated with tumorigenesis and aggressiveness traits in an esophageal adenocarcinoma cell line, and the gene was overexpressed in primary EAC tumors relative to normal esophageal tissue.
Data source: Laboratory studies using cell culture and a mouse model.
Disclosures: The authors reported that they had no relevant financial conflicts.
Talent: Too much or too little depends on the kind of team
Talent facilitates team performance, but only up to a point, depending on the type of team. For highly interdependent teams, there is a threshold where the benefits of more talent decrease and eventually become detrimental rather than beneficial, according to an analysis of five studies performed by Roderick I. Swaab and his colleagues.
Surveys across industries and countries show that organizations consider talent attraction their top priority, presumably based on the belief that more talent is better and that the relationship between talent and team performance is linear and monotonic, they stated in Psychological Science (2014;25:1581-91).
The researchers analyzed the results of five studies they performed to investigate this relationship, comparing the impact of talent on team performance in the highly interdependent sports of soccer (World Cup performance) and basketball (NBA) with its impact in the less interdependent sport of baseball (MLB).
In the case of soccer, team performance data were based on the average Fédération Internationale de Football Association (FIFA) rankings of national teams during the 2010 and 2014 World Cup qualification periods. Top talent was assessed as the number of players on each national team who were active in one of the world’s elite clubs divided by the total number of players on the national team.
In the case of the NBA, top talent was determined using the individual players’ estimated wins added (EWA) as determined over a 10-year period (2002-2012), with an index determining whether a player was in the top one-third (1) or not (0). Team performance was measured using each team’s end-of-year win percentage.
For baseball, top talent was determined using a player’s wins above replacement statistic (WAR), which is the number of wins a player contributes relative to a freely available minor-league player. Team performance was measured using each team’s win percentage.
In the results, basketball and soccer, the two most interdependent of the three sports, both showed a significant quadratic effect in which top talent benefited performance only up to a point, after which the marginal benefit of talent decreased and turned negative.
These results were in contrast to those seen with baseball. “As we predicted, the effect of top talent never turned negative in baseball, a sport in which task interdependence is relatively low. Thus there was no too-much-talent effect in baseball, unlike in football [soccer] and basketball,” according to the researchers.
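The talent measure and the quadratic pattern described above can be sketched in a few lines. This is an illustrative model only, not the study's actual data or coefficients; the function names and all numbers are hypothetical.

```python
# Hypothetical sketch of a "top-talent" share and a quadratic
# talent-performance model of the kind the study describes.

def top_talent_share(elite_club_players, squad_size):
    """Fraction of a national squad playing for one of the world's elite clubs."""
    return elite_club_players / squad_size

def predicted_performance(talent, linear=2.0, quadratic=-1.5):
    """Quadratic model: performance rises with talent, then declines
    once the negative quadratic term dominates."""
    return linear * talent + quadratic * talent ** 2

# The marginal benefit (the derivative) turns negative past the vertex
# at talent = -linear / (2 * quadratic); with these made-up coefficients,
# that is a top-talent share of about 0.67.
peak = -2.0 / (2 * -1.5)

print(round(top_talent_share(15, 23), 2))                       # 0.65
print(predicted_performance(0.5) > predicted_performance(0.9))  # True
```

With a positive linear term and a negative quadratic term, a moderately talented roster outperforms a maximally talented one in this toy model, which is the shape of the too-much-talent effect the authors report for soccer and basketball.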
“We predict that the too-much-talent effect will be found in other organizational contexts as well,” they added.
“Just as a colony of high-performing chickens competing for dominance suffers decrements in overall egg production and increases in bird mortality, teams with too much talent appear to divert attention away from coordination as team members peck at each other in their attempts to establish intergroup standing. In many cases, too much talent can be the seed of failure,” Mr. Swaab and his colleagues concluded.
The authors all declared that they had no conflicts of interest relative to the paper.
Mr. Swaab and his colleagues hypothesized the “too-much-talent effect” and found that talent often facilitates team performance, but only up to a point. They report that the relationship between talent and performance is not linear and monotonic; rather, they found that the relationship in football and basketball eventually turns negative (Psychol. Sci. 2014;25:1581-91). We have all heard the phrase, “There is no ‘I’ in team,” and it is clear that in certain sports, a team is necessary. No single cyclist can win the Tour de France without a strong team of support, and a soccer striker who is arrogant and feels he or she can score alone will fail to pass to the open teammate time and time again. We have all seen this strategy fail to result in a win. In contrast, one role of the midfielder in soccer is to provide selfless service to the striker – for the benefit of the team.
The cardiothoracic surgery world that we all work in can benefit from the findings reported by Swaab and colleagues. The individual with the “talent” in the cardiothoracic surgery realm – the skilled surgeon – may not be the best at building teams that work well together, and putting together several talented surgeons does not make an outstanding surgical team. Surgeons are not trained to be leaders, to be team builders, or to model collaborative behavior. A good leader requires emotional intelligence to be aware of others around her and of the successes and failures of the team. Lynda Gratton and Tamara J. Erickson found that the most productive, innovative teams were led by people who were both task- and relationship-oriented and that these leaders changed their style during the project (Eight Ways to Build Collaborative Teams, Harvard Business Review, November 2007). Their findings were similar to those reported by Mr. Swaab and his colleagues in that they also found that the greater the proportion of highly educated specialists on a team, the more likely the team is to disintegrate into unproductive conflicts.
If we wish to establish world-renowned service lines or surgical programs, we would be well served by understanding that a great team requires collaboration, cohesion, and diversity – diversity not only in talent but also in background and experience. In addition, surgeon leaders cannot score all of the goals alone; they need an integrated, cohesive, diverse, and collaborative team to provide excellent patient care.
Dr. Jennifer S. Lawton is a professor of surgery at the division of cardiothoracic surgery, Washington University, St. Louis. She is also an associate medical editor for Vascular Specialist.
FROM PSYCHOLOGICAL SCIENCE
Key clinical point: Talent facilitates team performance, but only up to a point, depending on the type of team.
Major finding: For highly interdependent teams, there is a threshold where the benefits of more talent decrease and eventually become detrimental rather than beneficial.
Data source: Five studies comparing the impact of talent on team performance in the highly interdependent sports of soccer (World Cup performance) and basketball (NBA) with that in the less interdependent sport of baseball (MLB).
Disclosures: The authors of the study reported no relevant disclosures.
CABG costs more in patients with diabetes
The proportion of coronary artery bypass graft patients with diabetes has increased more than fivefold in recent decades, and these patients are more likely to have worse outcomes and higher treatment costs, a study showed.
The percentage of patients who had diabetes among all those undergoing coronary artery bypass grafting (CABG) increased from 7% in the 1970s to 37% in the 2000s, according to a database study of 55,501 patients operated on at the Cleveland Clinic.
Patients were identified and preoperative, operative, and postoperative variables were collected, resulting in 45,139 nondiabetic patients and 10,362 diabetic patients (defined as those pharmacologically treated with either insulin or an oral agent) being evaluated. The endpoints assessed were in-hospital adverse outcomes as determined by the Society of Thoracic Surgeons National Database, in-hospital direct technical costs, and time-related mortality, according to Dr. Sajjad Raza and his colleagues at the Cleveland Clinic in the August issue of the Journal of Thoracic and Cardiovascular Surgery (150:294-301).
Compared with nondiabetics, diabetic patients undergoing CABG were older and were more likely to be overweight, to be women, and to have a history of heart failure, peripheral arterial disease, carotid disease, hypertension, renal failure, stroke, and advanced coronary artery disease. Over time, the cardiovascular risk profile of the entire population changed, becoming even more pronounced for all patients, but more so for diabetics.
Overall long-term survival at 6 months and at 1, 5, 10, 15, and 20 years for diabetic patients was 95%, 94%, 80%, 54%, 31%, and 18%, respectively, compared with 97%, 97%, 90%, 76%, 59%, and 42% for nondiabetic patients, a significant difference (P < .0001).
Propensity matching of similar diabetic and nondiabetic patients showed that deep sternal wound infection and stroke occurred significantly more often in diabetics, although there were no significant differences in cost remaining after matching, even though the length of stay greater than 14 days remained higher for diabetic patients.
“Although long-term survival after CABG is worse in diabetics and high-risk nondiabetics, it is important to note that, in general, high-risk patients reap the greatest survival benefit from CABG. Moreover, using surgical techniques that are associated with better long-term survival after CABG in diabetics could further enhance this survival benefit,” Dr. Raza and his colleagues wrote.
“Diabetes is both a marker for high-risk, resource-intensive, and expensive care after CABG and an independent risk factor for reduced long-term survival,” they added. “Diabetic patients and those with a similar high-risk profile set to undergo CABG should be made aware that their risks of postoperative complications are higher than average, and measures should be taken to reduce their postoperative complications,” Dr. Raza and his colleagues concluded.
The authors reported that they had no relevant conflicts of interest.
Patients with diabetes, with or without metabolic syndrome, represent an increasing challenge for cardiac surgery. CABG has been shown to convey a mortality benefit in such patients who also have multivessel disease. This study confirms what most clinicians already know – that the outcomes of patients with diabetes are worse than those in nondiabetic patients, according to Dr. Mani Arsalan and Dr. Michael Mack. “What is particularly important about this study, however, is that it is a single institutional experience with known surgical excellence and a very meticulous and complete outcomes database,” they wrote (J. Thorac. Cardiovasc. Surg. 2015;150:284-5).
Given their findings and the fact that CABG can be expected to remain the mainstay of treatment of multivessel disease in diabetics because of the results of the FREEDOM (Future Revascularization Evaluation in Patients With Diabetes Mellitus: Optimal Management of Multivessel Disease) trial, surgeons should pay increased attention to the details of the procedure for these patients. There should be an increased use of bilateral internal mammary arteries, which has been distressingly low, and yet can provide a 23% mortality benefit. “Two arteries are better than one.” Despite the increased risk of deep sternal infection, “the use of skeletonized bilateral internal mammary arteries in young, nonobese diabetic patients with a greater than 10-year life expectancy seems a reasonable risk to take,” Dr. Arsalan and Dr. Mack wrote. In addition, where possible, reaching satisfactory glycemic control before surgery can help decrease early complications. “The weight may be increasingly on our patients, but the real weight is on us as surgeons to help improve their early and long-term survival,” they concluded.
Dr. Arsalan and Dr. Mack are cardiovascular surgeons at Baylor Scott & White Health, Dallas. Their remarks were part of an invited commentary published with the paper.
FROM JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY
Key clinical point: The percentage of CABG patients with diabetes increased from 7% in the 1970s to 37% in the 2000s. The risk/benefit ratio warrants greater use of bilateral mammary arteries except in obese women with diabetes.
Major finding: Diabetic patients had significantly worse outcomes than nondiabetics with regard to hospital death, deep sternal wound infections, strokes, and renal failure as well as hospital stay and costs.
Data source: A retrospective analysis of a prospective database of patients undergoing first-time CABG at the Cleveland Clinic from 1972 to 2011.
Disclosures: The authors reported that they had no relevant conflicts of interest.
SVS: Cryoallografts, NAIS best to reconstruct infected aortic grafts
CHICAGO – Neoaortoiliac system reconstruction or cryopreserved allografts should be used first-line to replace infected aortic endografts, followed by antibiotic-soaked prosthetic grafts, according to a review of 206 cases from vascular surgery centers across the United States.
In the study, 75 patients had cryoallograft reconstruction, mostly cryoartery, or neoaortoiliac system (NAIS) reconstruction with femoral vein; 53 (71%) were alive at 5 years. Forty-nine of the 92 (53%) patients reconstructed with antibiotic-soaked prosthetic grafts also made it to 5 years, while two of 19 (11%) reconstructed with untreated prosthetic grafts survived that long.
These 186 cases were reconstructed inline. The index procedure had been endovascular aortic repair (EVAR) in 165 and thoracic EVAR (TEVAR) in 21, with better 5-year survival in the EVAR group. Another 11 patients had extra-anatomic reconstruction after initial EVAR, and their 5-year survival was comparable to inline reconstruction at about 50% overall. Nine other patients were managed medically; the majority died soon after being diagnosed with an infected graft.
“Clinicians should have a high index of suspicion to diagnose symptomatic postop EVAR and TEVAR patients with graft infection, especially in those patients with chronic infections or contaminated index procedures. NAIS and cryopreserved allografts require longer procedure times” – about 500 minutes versus about 350 minutes for prosthetic grafts – “but offer improved survival, while prosthetics soaked in antibiotic do better than prosthetic grafts alone,” said lead investigator Dr. Audra Duncan, professor of surgery at the Mayo Clinic in Rochester, Minn.
Patients were treated at the Mayo Clinic, the Cleveland Clinic in Ohio, Johns Hopkins University in Baltimore, the University of California, Los Angeles, and 15 other vascular centers around the country during 2004-2014. They were 68 years old on average, and 78% were men. Comorbidities included hypertension in 84%, smoking in 58%, and renal insufficiency in 30%.
On multivariate analysis, chronic infection, polymicrobial infection, and prosthetic reconstruction, among other things, predicted mortality after reconstruction.
Graft infections were primarily polymicrobial and fungal, and were diagnosed a mean of 716 days after the initial implant, generally by CT. Symptoms included pain in 66%, mostly in the back and abdomen, and fever and chills, also in 66%. Streptococcus, Escherichia coli, and both methicillin-sensitive and -resistant Staphylococcus aureus were among the most commonly isolated organisms. No particular type of graft seemed more likely to get infected.
The sources of infection are unknown, but index procedures were complicated by urinary tract, groin, and other infections in about one-third of patients. About one-third also had interval procedures, including endoleak intervention. About 14% of patients were thought to have had a contaminated index procedure.
Patients stayed in the hospital a mean of 24 days after reconstruction. Early complications included persistent sepsis in 27 patients, myocardial infarction in 9, recurrent infection in 9, and pneumonia in 8. Mortality at 30 days was 11%.
Nineteen replacement grafts – mostly unsoaked Dacron – were explanted after a mean of 540 days. Persistent sepsis after reconstruction was associated with unsoaked Dacron and polytetrafluoroethylene (PTFE) grafts.
To prevent graft infections, Mayo patients “take an antibiotic for any invasive procedure,” including dental work. “I am not sure we have data to support that, but it is something we do,” Dr. Duncan said.
Dr. Duncan has no relevant disclosures.
AT THE 2015 VASCULAR ANNUAL MEETING
Key clinical point: Neoaortoiliac system reconstruction or cryopreserved allografts perform best in replacing infected aortic endografts, but if you have to use prosthetics, use ones soaked in antibiotics.
Major finding: Seventy-five patients had cryoallograft reconstruction, mostly cryoartery, or neoaortoiliac system (NAIS) reconstruction with femoral vein; 53 (71%) were alive at 5 years. Two of 19 (11%) patients reconstructed with untreated prosthetic grafts survived that long.
Data source: Review of 206 patients at 19 vascular surgery centers in the United States.
Disclosures: The lead investigator has no relevant disclosures.
ZEUS: Second-generation DES with 30 days DAPT best in bleeding-risk patients
PARIS – The use of a second-generation zotarolimus-eluting coronary stent rather than a bare metal stent in conjunction with 30 days of dual antiplatelet therapy (DAPT) in patients deemed at high bleeding risk results in lower 1-year rates of major adverse cardiovascular events and stent thrombosis, according to a prespecified analysis of the ZEUS trial presented at the annual congress of the European Association of Percutaneous Cardiovascular Interventions.
Asked if the ZEUS results mean it’s time to take bare-metal stents (BMS) out of the cupboard and get rid of them, the study presenter, Dr. Marco Valgimigli, replied, “I did so already.”
ZEUS (Zotarolimus-Eluting Endeavor Sprint Stent in Uncertain DES Candidates Study) was an open-label, prospective study in which 1,606 patients undergoing urgent or emergent percutaneous coronary intervention were randomized to a thin-strut BMS or the zotarolimus-eluting Endeavor Sprint stent, a second-generation hydrophilic polymer-based device that, uniquely, elutes 100% of the drug within the first 2 weeks. All participants were placed on an intended 30-day regimen of DAPT. The study was conducted in four European countries, explained Dr. Valgimigli of Erasmus University in Rotterdam.
This prespecified analysis focused on the 828 patients with one or more factors placing them at high bleeding risk, since the use of a drug-eluting stent (DES) with a 30-day DAPT protocol hadn’t been adequately studied in that setting, the cardiologist noted.
High bleeding risk was defined by one or more of the following: age greater than 80, being on oral anticoagulation therapy, a prior bleeding event, need for corticosteroid or NSAID therapy, known anemia, or a bleeding diathesis; 47% of study participants had more than one of these criteria.
The primary study endpoint was the composite of all-cause mortality, acute MI, or target vessel revascularization through 1 year of follow-up. The rate was 29% in the BMS group, compared with 22.6% in the DES group, for a highly significant 26% relative risk reduction.
The zotarolimus-eluting stent group also fared significantly better in terms of stent thrombosis and the other prespecified secondary endpoints.
Asked how the ZEUS findings have affected his own clinical practice, Dr. Valgimigli replied, “My stent of choice in patients at high bleeding risk is a second-generation DES. Since there aren’t data showing a specific second-generation DES is preferable, basically whatever I have I implant.”
He sticks to the 30-day DAPT regimen featured in the ZEUS protocol except under specific circumstances, which were allowed under the protocol. One involves staged PCI procedures, in which case the 30 days of DAPT begins after the last stent is implanted, even though the patient has been on DAPT in the interim. The other circumstance where he goes beyond 30 days of DAPT in a patient on a second-generation DES is if an ischemic event occurs down the road: “That patient is put back on DAPT and left there,” he said.
In response to another question, Dr. Valgimigli said he doesn’t believe the lower stent thrombosis rate seen in the Endeavor Sprint group in ZEUS is unique to that stent.
“If you look at any BMS versus DES study, taking the first-generation DES out of the picture, it’s quite clear that the second-generation DES are much safer than a BMS,” according to the cardiologist.
The ZEUS study was sponsored by the University of Ferrara (Italy) and funded by Medtronic. Dr. Valgimigli serves as a consultant to and/or on speakers’ bureaus for Medtronic and more than half a dozen other pharmaceutical and medical devices companies.
AT EUROPCR 2015
Key clinical point: High bleeding risk patients fare significantly better with a second-generation drug-eluting stent and 30 days of dual antiplatelet therapy than with a bare-metal stent.
Major finding: The 1-year incidence of major adverse cardiovascular events was 29.6% in high bleeding risk patients who received a bare-metal stent and 22.6% in those who got a second-generation zotarolimus-eluting stent with 30 days of dual antiplatelet therapy.
Data source: This was a prespecified analysis of 828 high bleeding risk patients randomized to a bare-metal stent or a second-generation zotarolimus-eluting stent in conjunction with 30 days of DAPT and then followed prospectively for 1 year.
Disclosures: The ZEUS study was sponsored by the University of Ferrara (Italy) and funded by Medtronic. The presenter serves as a consultant and/or on speakers’ bureaus for Medtronic and more than half a dozen other pharmaceutical and medical devices companies.
Risk of major bleeding is decreased when AF patients do not receive bridging anticoagulation
TORONTO – Forgoing bridging anticoagulation in patients with atrial fibrillation (AF) is noninferior to perioperative bridging with low-molecular-weight heparin for the prevention of arterial thromboembolism and decreases the risk of major bleeding.
Those results emerged from trial data presented at the International Society on Thrombosis and Haemostasis congress and published simultaneously in the New England Journal of Medicine. Study investigator Dr. Thomas Ortel, chief of the division of hematology at Duke University Medical Center, Durham, N.C., discussed results of the BRIDGE (Effectiveness of Bridging Anticoagulation for Surgery) trial, which evaluated the safety and efficacy of bridging anticoagulant therapy.
Bridging anticoagulation is frequently used in patients taking chronic oral anticoagulant therapy who need their anticoagulation transiently held for an operation or invasive procedure. However, the need for bridging anticoagulation has never been shown definitively, Dr. Ortel said in an interview.
“This is the first prospective, randomized, placebo-controlled, double-blind clinical trial to investigate the role of bridging anticoagulant therapy in patients with AF on chronic anticoagulation with warfarin who need the anticoagulant therapy held for an elective operation or invasive procedure,” he said.
Dr. Ortel and his coauthors evaluated 1,884 patients in the trial, which compared bridging and no bridging in patients with nonvalvular/valvular AF or atrial flutter who required warfarin interruption for elective surgery. The median age was 72.7 years, and 73% of patients were male. A total of 336 patients had a history of stroke or transient ischemic attack.
After stopping warfarin 5 days before the procedure, study participants received dalteparin 100 IU/kg (934 patients) or matching placebo (950 patients) for 3 days before and 5-9 days after the procedure. Dalteparin/placebo was resumed 12-24 hours after minor surgery and 48-72 hours after major surgery.
Warfarin was resumed 24 hours or less after the procedure. Follow-up lasted 30 ± 7 days after the procedure. Primary outcomes were arterial thromboembolism and major bleeding. Secondary outcomes were minor bleeding, death, myocardial infarction, and venous thromboembolism.
Protocol adherence occurred in 81% of patients before the procedure, and in 94.5% of patients post procedure.
The incidence of arterial thromboembolism was 0.4% in the no-bridging group, compared with 0.3% in the bridging group (95% confidence interval, –0.6 to 0.8; P = .01 for noninferiority). The incidence of major bleeding was 1.3% in the no-bridging group and 3.2% in the bridging group (relative risk, 0.41; 95% CI, 0.20-0.78; P = .005 for superiority).
“Current practice guidelines provide weak and inconsistent recommendations concerning the need for bridging anticoagulation,” Dr. Ortel said. “This study provides the highest level of evidence to support a strong recommendation concerning the role of bridging in this patient population.”
It is estimated that approximately one in six warfarin-treated patients with AF will need anticoagulation transiently held for an elective operation or invasive procedure each year, making this a common clinical scenario for providers, Dr. Ortel said. Knowing the findings from the BRIDGE trial will help guide clinicians in making decisions when this situation arises in their patients, he concluded.
“With the introduction of the direct oral anticoagulants, we will now need to develop periprocedural approaches to manage patients on a variety of different agents,” he said. “Warfarin continues to be extensively used in many of these patients, however, and the BRIDGE trial will contribute to improved management for these individuals.”
In response to an audience member’s question about which patients should receive bridging anticoagulation, Dr. Ortel said that “right now, our data would suggest that for AF patients, we don’t need to bridge.”
“I can’t say that, necessarily, for prosthetic heart valves or for venous thromboembolism. I think some of the recommendations that you’ve seen in the guidelines where people try to stratify this by how recently they had thromboembolism or by what type of heart valve they have – those might be the higher-risk patients to consider. But that’s all based on existing guidelines and no prospective data, so I feel comfortable telling you who you don’t need to bridge in, but I’m not going to tell you who you should,” he added.
The BRIDGE Trial was sponsored by the National Heart, Lung, and Blood Institute. Dr. Ortel disclosed grant/research support from Eisai Co. Ltd and Pfizer Inc.
TORONTO – Forgoing bridging anticoagulation in patients with atrial fibrillation (AF) is noninferior to perioperative bridging with low-molecular-weight heparin for the prevention of arterial thromboembolism and decreases the risk of major bleeding.
Those results emerged from trial data presented at the International Society on Thrombosis and Haemostasis congress and published simultaneously in the New England Journal of Medicine. Study investigator Dr. Thomas Ortel, chief of the division of hematology at Duke University Medical Center, Durham, N.C., discussed results of the BRIDGE (Effectiveness of Bridging Anticoagulation for Surgery) trial, which evaluated the safety and efficacy of bridging anticoagulant therapy.
Bridging anticoagulation is frequently used in patients taking chronic oral anticoagulant therapy who need their anticoagulation transiently held for an operation or invasive procedure. The need for bridging anticoagulation never has been shown definitively, however, Dr. Ortel said in an interview.
“This is the first prospective, randomized, placebo-controlled, double-blind clinical trial to investigate the role of bridging anticoagulant therapy in patients with AF on chronic anticoagulation with warfarin who need the anticoagulant therapy held for an elective operation or invasive procedure,” he said.
TORONTO – Forgoing bridging anticoagulation in patients with atrial fibrillation (AF) is noninferior to perioperative bridging with low-molecular-weight heparin for the prevention of arterial thromboembolism and decreases the risk of major bleeding.
Those results emerged from trial data presented at the International Society on Thrombosis and Haemostasis congress and published simultaneously in the New England Journal of Medicine. Study investigator Dr. Thomas Ortel, chief of the division of hematology at Duke University Medical Center, Durham, N.C., discussed results of the BRIDGE (Effectiveness of Bridging Anticoagulation for Surgery) trial, which evaluated the safety and efficacy of bridging anticoagulant therapy.
Bridging anticoagulation is frequently used in patients on chronic oral anticoagulant therapy who need their anticoagulation transiently held for an operation or invasive procedure. The need for bridging anticoagulation has never been definitively shown, however, Dr. Ortel said in an interview.
“This is the first prospective, randomized, placebo-controlled, double-blind clinical trial to investigate the role of bridging anticoagulant therapy in patients with AF on chronic anticoagulation with warfarin who need the anticoagulant therapy held for an elective operation or invasive procedure,” he said.
Dr. Ortel and his coauthors evaluated 1,884 patients in the trial, which compared bridging and no bridging in patients with nonvalvular/valvular AF or atrial flutter who required warfarin interruption for elective surgery. The median age was 72.7 years, and 73% of patients were male. A total of 336 patients had a history of stroke or transient ischemic attack.
After stopping warfarin 5 days before the procedure, study participants received dalteparin 100 IU/kg (934 patients) or matching placebo (950 patients) for 3 days before and 5-9 days after the procedure. Dalteparin/placebo was resumed 12-24 hours after minor surgery and 48-72 hours after major surgery.
Warfarin was resumed 24 hours or less after the procedure. Follow-up lasted 30 ± 7 days after the procedure. Primary outcomes were arterial thromboembolism and major bleeding. Secondary outcomes were minor bleeding, death, myocardial infarction, and venous thromboembolism.
Protocol adherence occurred in 81% of patients before the procedure, and in 94.5% of patients post procedure.
The incidence of arterial thromboembolism was 0.4% in the no-bridging group, compared with 0.3% in the bridging group (95% confidence interval, –0.6 to 0.8; P = .01 for noninferiority). The incidence of major bleeding was 1.3% in the no-bridging group and 3.2% in the bridging group (relative risk, 0.41; 95% CI, 0.20-0.78; P = .005 for superiority).
“Current practice guidelines provide weak and inconsistent recommendations concerning the need for bridging anticoagulation,” Dr. Ortel said. “This study provides the highest level of evidence to support a strong recommendation concerning the role of bridging in this patient population.”
An estimated one in six warfarin-treated patients with AF will need anticoagulation transiently held for an elective operation or invasive procedure each year, making this a common clinical scenario for providers, Dr. Ortel said. The findings from the BRIDGE trial will help guide clinicians in making decisions when this situation arises in their patients, he concluded.
“With the introduction of the direct oral anticoagulants, we will now need to develop periprocedural approaches to manage patients on a variety of different agents,” he said. “Warfarin continues to be extensively used in many of these patients, however, and the BRIDGE trial will contribute to improved management for these individuals.”
In response to an audience member’s question about which patients should receive bridging anticoagulation, Dr. Ortel said that “right now, our data would suggest that for AF patients, we don’t need to bridge.”
“I can’t say that, necessarily, for prosthetic heart valves or for venous thromboembolism. I think some of the recommendations that you’ve seen in the guidelines where people try to stratify this by how recently they had thromboembolism or by what type of heart valve they have – those might be the higher-risk patients to consider. But that’s all based on existing guidelines and no prospective data, so I feel comfortable telling you who you don’t need to bridge in, but I’m not going to tell you who you should,” he added.
The BRIDGE Trial was sponsored by the National Heart, Lung, and Blood Institute. Dr. Ortel disclosed grant/research support from Eisai Co. Ltd and Pfizer Inc.
FROM 2015 ISTH CONGRESS
Key clinical point: Forgoing bridging anticoagulation in patients with atrial fibrillation is noninferior to perioperative bridging for preventing arterial thromboembolism and decreases the risk of major bleeding.
Major finding: The incidence of arterial thromboembolism was 0.4% vs. 0.3% in the no-bridging and bridging groups, respectively. The incidence of major bleeding was 1.3% in the no-bridging group and 3.2% in the bridging group.
Data source: A prospective, randomized, placebo-controlled, double-blind trial of 1,884 patients with nonvalvular/valvular AF or atrial flutter who required warfarin interruption for elective surgery.
Disclosures: The BRIDGE Trial was sponsored by the National Heart, Lung, and Blood Institute. Dr. Ortel disclosed grant/research support from Eisai Co. Ltd and Pfizer Inc.