Free fatty acids (FFAs) are promising biomarkers for the diagnosis and assessment of several diseases. Elevated FFA levels have been associated with insulin resistance and with cardiovascular conditions such as arteriosclerosis, myocardial dysfunction, cardiac arrhythmias, and sudden cardiac death. It is therefore important to study the relationship between FFAs and atrial fibrillation (AF), particularly whether FFAs can predict AF recurrence after catheter ablation.
Patients with symptomatic paroxysmal or persistent AF undergoing radiofrequency catheter ablation for the first time were included in the study. Plasma FFA levels were measured upon admission and 3 months after ablation.
A total of 88 patients with AF (55 males, 33 females; mean age, 62.5 ± 8.7 years) were included in the analysis. FFA levels upon admission in patients with paroxysmal and persistent AF were 0.38 ± 0.16 and 0.37 ± 0.15 mmol/L, respectively. During the 3-month follow-up after radiofrequency ablation, FFA concentrations in patients with paroxysmal and persistent AF were 0.38 ± 0.18 and 0.36 ± 0.18 mmol/L, respectively, and FFA concentrations in patients with and without AF recurrence were 0.69 ± 0.07 and 0.33 ± 0.14 mmol/L, respectively. Kaplan–Meier analysis showed that AF recurrence was significantly higher in patients with FFA ≥0.53 mmol/L than in those with FFA <0.53 mmol/L (p < 0.001). FFA concentration at 3 months post-ablation was an independent predictor of AF recurrence after catheter ablation (hazard ratio = 10.45, 95% CI [8.61–25.33], p = 0.03).
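The Kaplan–Meier comparison above estimates recurrence-free survival from right-censored follow-up data. As a minimal illustrative sketch (the study itself would use a statistical package, and the follow-up times below are hypothetical, not patient data):

```python
def kaplan_meier(times, events):
    """Return (time, survival probability) pairs for right-censored data.

    times  : follow-up time for each subject
    events : 1 if the event (e.g., AF recurrence) was observed, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0  # events at time t
        c = 0  # subjects censored at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            # survival drops by the fraction of at-risk subjects with an event
            surv *= (n_at_risk - d) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= d + c
    return curve

# Hypothetical follow-up times (months) and recurrence flags
times = [2, 3, 3, 5, 8, 10, 12, 12]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

Comparing two such curves (here, FFA ≥0.53 vs. <0.53 mmol/L) is then typically done with a log-rank test.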
Elevated postoperative FFA levels were closely related to AF recurrence at the 1-year follow-up, implying that postoperative FFA levels may serve as a predictive biomarker for AF recurrence. FFAs may also represent a new therapeutic target for the prevention and treatment of AF.
Acute myocardial infarction (AMI) remains a global health challenge. This has driven innovation toward precision medicine, including major advances in several key areas. Precision Reperfusion: Intravascular ultrasound (IVUS), optical coherence tomography (OCT), and fractional flow reserve (FFR) can be used to optimize stent deployment, thereby reducing thrombosis and restenosis. Bioabsorbable stents and drug-coated balloons (DCBs) show promise in minimizing long-term complications. Mechanical Circulatory Support (MCS): Early use of Impella and veno-arterial extracorporeal membrane oxygenation (VA-ECMO) has been shown to improve survival in select AMI-cardiogenic shock patients, although device selection and timing require further validation. Antiplatelet Personalization: Genotyping (e.g., CYP2C19) and platelet function testing enable tailored dual antiplatelet therapy (DAPT), thus balancing ischemic and bleeding risks. Regenerative Therapies: Extracellular vesicles (EVs) from stem cells or cardiac progenitors have shown cardioprotective effects in preclinical models, addressing limitations of cell-based approaches. Artificial intelligence (AI)-driven platforms can optimize EV delivery and tissue repair. AI-Enhanced Diagnostics: Machine learning models improve electrocardiogram (ECG) interpretation, risk stratification, and the detection of ST-segment elevation myocardial infarction (STEMI). This review aims to provide a theoretical foundation for practical clinical applications in the treatment of AMI.
Thoracic injuries requiring surgical intervention remain an important consideration in blunt and penetrating trauma and carry exceedingly high morbidity and mortality. In the United States, much of modern-day management of intrathoracic injuries has been derived from military medical experience. However, thoracic vascular injuries account for only 6% of thoracic trauma, leading to decreased preparedness to address such injuries. To address this knowledge gap, a literature review was conducted to examine the operative techniques for management of intrathoracic hemorrhage from direct cardiac injuries, great vessel injuries, and pulmonary injuries. The literature review was conducted via PubMed utilizing the key terms “traumatic thoracic hemorrhage”, “traumatic cardiac injury”, “traumatic great vessel injury”, “traumatic pulmonary injury”, “penetrating cardiac trauma”, “anterolateral thoracotomy”, “trauma extracorporeal membrane oxygenation (ECMO)”, and “thoracic damage control surgery”, and included studies from 1987 to the present. Citation chaining and author discretion were also used to identify relevant articles for inclusion. Two primary operative approaches, the anterolateral thoracotomy and median sternotomy, provide adequate exposure to repair most intrathoracic injuries. Direct cardiac injuries are best repaired using permanent pledgeted sutures. Repair of traumatic great vessel injuries presents a significant challenge, often necessitating extension of the initial incision to enable proximal and distal vascular control when endovascular options are unavailable. Traumatic pulmonary injuries often require non-anatomic lung resection. Many aspects of care for intrathoracic hemorrhage in the civilian setting apply to battlefield management; however, specific considerations, such as resource availability and patient transport, are important given the potential for prolonged field care.
In the future, there may be a role for venovenous and venoarterial extracorporeal membrane oxygenation. Algorithmic, flexible, and effective management strategies offer the greatest utility in treating these high-mortality injuries leading to intrathoracic hemorrhage in both civilian and military settings. New avenues for extracorporeal support offer additional approaches to battlefield care in resource-limited environments.
Home health care (HHC) may help reduce the burden on patients and families after interventions and potentially reduce hospital length of stay (LOS). This study aimed to assess the outcomes of patients undergoing transcatheter aortic valve replacement (TAVR) who were discharged with or without HHC services.
This retrospective analysis utilized the Nationwide Readmissions Database (NRD) to identify TAVR patients (2010–2018) who were categorized based on discharge disposition into either the HHC cohort or the routine cohort. Propensity-matched outcomes are reported. Additionally, a multivariate logistic regression analysis was performed to examine the 30-day readmission rate.
A total of 94,491 patients undergoing TAVR were included; 66.9% were routinely discharged, while 33.1% were discharged with HHC. The median age was higher in the HHC cohort (83.0 vs. 81.0 years; p < 0.01), which also comprised a greater proportion of women (48.7% vs. 41.8%; p < 0.01). After propensity matching, the LOS was longer in the HHC cohort (4.0 days [2.0–7.0] vs. 3.0 days [2.0–5.0]; p < 0.01). The 30-day readmission rate (19.9% vs. 15.8%; p < 0.01) and mortality rate (0.4% vs. 0.3%; p < 0.01) also remained higher in the HHC cohort. In the logistic regression analysis, the HHC status (odds ratio (OR): 1.34 [95% confidence interval (CI): 1.30–1.38]; p < 0.01), myocardial infarction (MI) (1.24 [1.15–1.33]; p < 0.01), paraplegia (2.10 [1.30–3.39]; p < 0.01), bowel ischemia (1.42 [1.07–1.88]; p = 0.02), and acute kidney injury (1.27 [1.22–1.33]; p < 0.01) were associated with 30-day readmission.
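The odds ratios above come from exponentiating logistic-regression coefficients. As an illustrative arithmetic sketch, the coefficient and standard error below are back-solved from the reported HHC odds ratio (OR 1.34, 95% CI 1.30–1.38), so they are approximations, not the study's actual model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a log-odds coefficient into an OR with a 95% CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Back-solved from the reported HHC result (OR 1.34 [1.30-1.38])
beta = math.log(1.34)                              # log-odds coefficient
se = (math.log(1.38) - math.log(1.30)) / (2 * 1.96)  # SE implied by the CI
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The same transformation applies to each covariate in the model (MI, paraplegia, bowel ischemia, acute kidney injury), since the CI is symmetric on the log-odds scale rather than the OR scale.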
In conclusion, post-TAVR utilization of HHC services was associated with higher in-hospital complications and increased odds of 30-day readmissions. Thus, optimizing procedures for routine discharge and refining the criteria for HHC may help improve outcomes.
Takayasu arteritis (TAK), a chronic inflammatory condition often leading to aortic dilation and regurgitation (AR), poses significant surgical challenges due to vascular fragility and calcification, increasing risks of complications. For high-risk TAK patients with severe AR, transcatheter aortic valve replacement (TAVR) may be a viable alternative.
A 49-year-old woman with TAK presented with severe AR, porcelain aorta, and aortic branch malformation. She successfully underwent transapical TAVR with a 25 mm J-valve. Post-procedural follow-up at two years showed notable improvement in left ventricular dimensions and ejection fraction.
TAVR with the J-valve appears to be an effective and safe treatment for high-risk TAK patients with severe AR and porcelain aorta, providing satisfactory mid-term outcomes. This minimally invasive approach represents a valuable option when surgery is contraindicated due to anatomical complexity and tissue fragility.
The intra-aortic balloon pump (IABP) is a relatively economical device for providing temporary cardiac support in cases of cardiac dysfunction or ongoing cardiac ischemia.
This brief review describes techniques and strategies for using the IABP, aimed at trainees, nurses, intensivists, and other practitioners who may lack familiarity with the device yet must help manage patients who have had one inserted in a catheterization lab or operating room.
These devices have proven useful in supporting cardiac patients and can be managed by all who care for these patients.
Following the first orthotopic heart transplant, performed in 1967, pediatric heart transplantation has undergone significant advances over the last five decades. Survival has improved accordingly; recipients of an orthotopic heart transplant now survive for decades longer. A significant advancement in this area involves the management of blood type (ABO) incompatibility: recent protocols and antibody-mediated therapies have made ABO-incompatible transplants more feasible, improving graft survival. The donor pool has also expanded through donation after circulatory death, and innovation in management and preservation techniques has demonstrated that such donations have acceptable post-transplant outcomes. Immunosuppressive therapy has also evolved with the emergence of tacrolimus monotherapy, which is gaining attention as a potential strategy for reducing the risks associated with polypharmacy while maintaining graft function. Moreover, ex-vivo perfusion systems have optimized donor heart preservation by reducing cold ischemia time and improving graft quality. With advancements in systems and processes, surgical procedures for partial heart transplantation have shown promise for selected patients. Ultimately, xenotransplantation is an emerging frontier in addressing the persistent organ shortage. Thus, this manuscript presents a comprehensive review of the progress in pediatric heart transplantation over the past decade, as well as the prospects for this field of research.
Myocardial protection during cardiac surgery is critical for preserving cardiac function, minimizing ischemia-reperfusion injury, and preventing myocardial stunning. Although advances in cardioplegia formulations, delivery techniques, pharmacologic agents, and controlled reperfusion strategies have entered clinical practice, substantial variability in protocols and limited comparative evidence continue to hinder optimization and standardization. This narrative review aims to highlight current evidence, identify key gaps in perioperative myocardial protection, and discuss emerging opportunities for innovation and personalized strategies. A comprehensive literature search was conducted in PubMed and Embase through June 2025, using a combination of MeSH terms and keywords related to cardioplegia and myocardial protection. High-quality studies, including randomized trials, systematic reviews, and authoritative expert opinions, were selected according to relevance to key themes, including cardioplegia types, delivery techniques, risk populations, pharmacologic adjuncts, and real-time monitoring technologies. No cardioplegia strategy is universally accepted, and cardioplegic composition, temperature, and delivery methods vary widely across institutions. Although modified del Nido solutions have gained popularity, comparative evidence remains inconsistent. Small-volume cardioplegia solutions (e.g., Cardioplexol®) show promise. High-risk populations, such as patients with diabetes or left ventricular hypertrophy, continue to experience suboptimal outcomes, probably because of distinct metabolic and structural vulnerabilities. Pharmacologic agents mimicking ischemic preconditioning have achieved limited translation into routine practice, and remote ischemic conditioning remains underused owing to inconsistent evidence.
Real-time intraoperative monitoring of myocardial injury and established threshold values for early myocardial injury biomarkers are notably lacking. Emerging modalities such as intramyocardial pH sensors and coronary sinus metabolite sampling offer promise for early injury detection but are far from achieving widespread use. Substantial gaps persist in the personalization and standardization of myocardial protection in cardiac surgery. Innovative approaches are required to advance intraoperative sensing technologies and adjunct protective interventions tailored to patient risk profiles. Incorporating artificial intelligence, leveraging omics data, and fostering multi-institutional collaboration are key steps toward a new era of precision myocardial protection.
To evaluate the neuroprotective efficacy of combining unilateral antegrade selective cerebral perfusion with percentage-controlled flow regulation during aortic arch reconstruction surgery for aortic dissection.
A retrospective analysis was conducted using clinical data from 226 consecutive patients who underwent surgery for acute aortic dissection with arch reconstruction at our hospital between January 2020 and January 2021. Based on the cerebral protection strategy used, patients were divided into two groups: the percentage-flow cerebral perfusion group (n = 89) and the control group (n = 137). The severity of neurological impairment was rigorously evaluated using standardized biomarker assessments, including serial measurements of serum S100β protein and neuron-specific enolase (NSE) levels. These biomarkers were systematically analyzed and compared between the two groups at two critical time points: preoperatively (baseline) and postoperatively during follow-up. Multivariate analysis was subsequently performed to identify independent risk factors associated with postoperative neurological dysfunction following surgical repair.
No statistically significant differences were observed in baseline characteristics or intraoperative parameters between the two groups (all p > 0.05). Postoperative mortality was comparable (4.5% vs. 4.4%, p = 0.915). However, the percentage-flow cerebral perfusion group showed a significantly lower incidence of neurological dysfunction—including both temporary and permanent deficits—compared to the conventional control group (8.98% vs. 18.98%, p = 0.031). Additionally, these patients demonstrated significantly shorter times to wakefulness and extubation (both p < 0.05). Serum biomarker analysis further indicated markedly elevated levels of S100β and NSE in the control group relative to the percentage-flow group (both p < 0.05). Univariate and multivariate regression analyses identified age, unilateral cerebral perfusion time, and cardiopulmonary bypass (CPB) time as independent risk factors for postoperative neurological injury. A predictive model incorporating these variables exhibited strong discriminative power (area under the curve, AUC = 0.838) and good stability (p = 0.256).
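The reported AUC = 0.838 summarizes how well the predictive model separates patients who develop neurological injury from those who do not. A minimal sketch of how an AUC is computed from predicted risks, using the Mann–Whitney (pairwise-comparison) form; the toy scores below are hypothetical, not study data:

```python
def auc(pos_scores, neg_scores):
    """Probability that a random positive case (injured) is scored above a
    random negative case (uninjured), with ties counting half - the
    Mann-Whitney form of the area under the ROC curve."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model-predicted risks for injured (pos) vs. uninjured (neg)
pos = [0.9, 0.8, 0.6, 0.55]
neg = [0.7, 0.5, 0.4, 0.3, 0.2]
print(auc(pos, neg))
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, so 0.838 indicates strong discriminative power.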
Optimized cerebral perfusion flow significantly shortens the time to awakening and extubation in patients undergoing acute aortic dissection repair, while reducing neurological injury, as supported by decreased serum levels of the biomarkers S100β and NSE. These results indicate a considerable neuroprotective benefit. Moreover, multivariate analysis confirmed that age, unilateral cerebral perfusion duration, and cardiopulmonary bypass (CPB) time are independent risk factors for postoperative neurological impairment. A predictive model integrating these factors exhibited strong clinical applicability.
Acute type A aortic dissection (ATAAD) complicated by preoperative shock is associated with fatal outcomes. Preoperative shock is caused by coronary malperfusion, cardiac tamponade, and aortic rupture. However, few reports have addressed mid-to-long-term outcomes in patients with ATAAD complicated by preoperative shock.
Between October 2013 and November 2024, 181 patients with ATAAD underwent emergent aortic repair, including 44 (24.3%) with preoperative shock. Preoperative shock included cardiac tamponade, cardiopulmonary arrest, aortic rupture, and coronary malperfusion. The mean age of patients was 60.4 ± 14.0 years. We compared postoperative outcomes between patients with ATAAD complicated by preoperative shock (shock group) and patients without shock (non-shock group).
Early mortality was 43.2% (19/44) in the shock group and 17.5% (24/137) in the non-shock group, a significant difference between the two groups (p < 0.01). Logistic regression analysis demonstrated that older age and preoperative shock were significant predictors of early mortality (p < 0.01 and p = 0.02, respectively). The follow-up period was 34.0 ± 36.6 months. The cumulative survival rate at 10 years was 54.5% in the shock group and 65.8% in the non-shock group, a significant difference between the two groups (p < 0.01). However, on Cox proportional hazards regression analysis, preoperative shock was not an independent risk factor for cumulative survival.
The mid-to-long-term survival rate of patients with acute type A aortic dissection complicated by preoperative shock was not inferior to that of patients without shock, provided they survived the initial aortic repair. Preoperative shock was, however, a risk factor for early mortality in this cohort.