While the invasive index of microcirculatory resistance (IMR) remains the gold standard for diagnosing coronary microvascular dysfunction (CMD), its clinical adoption is limited by procedural complexity and cost. Angiography-derived IMR (Angio-IMR), computed from routine coronary angiograms, offers a promising alternative. This study evaluates the diagnostic efficacy of Angio-IMR for CMD detection in angina pectoris (AP).
A comprehensive literature search was conducted across PubMed, Embase, Scopus, and the Cochrane Library to identify studies assessing Angio-IMR's diagnostic performance for CMD in AP populations. Primary outcomes included pooled sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and area under the receiver operating characteristic (ROC) curve (AUC).
Eleven studies involving 927 patients were included. Angio-IMR demonstrated robust diagnostic performance: sensitivity 86% (95% CI: 83%–90%), specificity 90% (95% CI: 87%–92%), PPV 82% (95% CI: 78%–86%), NPV 91% (95% CI: 88%–94%), and AUC 0.91 (95% CI: 0.89–0.94), with low heterogeneity. Subgroup analyses revealed no significant differences in diagnostic accuracy between obstructive (stenosis ≥50%) and non-obstructive coronary artery disease. Hyperemic (adenosine-induced) Angio-IMR measurements showed superior sensitivity (89% vs. 86%) and specificity (94% vs. 91%) compared with resting-state assessments using the AccuFFR system. Additionally, the sensitivity (88% vs. 82%), specificity (92% vs. 86%), PPV (82% vs. 78%), and NPV (91% vs. 88%) calculated with AccuFFR were higher than those calculated with the quantitative flow ratio (QFR).
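For readers less familiar with the four accuracy metrics reported above, the minimal sketch below shows how each is defined for a single 2×2 confusion table. The counts are invented for illustration only; the pooled estimates in this abstract come from a formal meta-analytic model, not from any one table.

```python
# Hedged sketch: definitions of the four diagnostic-accuracy metrics for one
# hypothetical 2x2 table (counts are invented, not taken from the meta-analysis).
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV for a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)   # true positives among diseased patients
    specificity = tn / (tn + fp)   # true negatives among non-diseased patients
    ppv = tp / (tp + fp)           # probability of disease given a positive test
    npv = tn / (tn + fn)           # probability of no disease given a negative test
    return sensitivity, specificity, ppv, npv

# Hypothetical single-study counts (100 diseased, 100 non-diseased):
se, sp, ppv, npv = diagnostic_metrics(tp=86, fp=10, fn=14, tn=90)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the sampled population, which is why pooled predictive values should be interpreted with that caveat in mind.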
Angio-IMR is a reliable, non-invasive tool for CMD identification in angina patients, particularly under hyperemic conditions. Its diagnostic consistency across stenosis severity subgroups supports broad clinical applicability.
Globally, acute myocardial infarction (AMI) is among the leading causes of mortality. The optimal approach to blood pressure (BP) management in patients with ST-segment elevation myocardial infarction (STEMI) undergoing percutaneous coronary intervention (PCI) remains debated. Current guidelines on BP management lack specific recommendations for STEMI patients undergoing PCI, resulting in substantial individual variability and uncertainty in clinical treatment strategies. This research seeks to determine the BP levels associated with the lowest risk of in-hospital mortality and long-term adverse endpoints in STEMI patients receiving PCI.
This retrospective study analyzed data from the China Acute Myocardial Infarction (CAMI) Registry, enrolling 10,482 STEMI patients undergoing PCI at 108 Chinese hospitals from January 2013 to September 2014. The primary outcome was in-hospital mortality. Secondary outcomes included 2-year all-cause mortality, severe bleeding, and major adverse cardiac and cerebrovascular events (MACCEs), defined as a composite of all-cause mortality, myocardial infarction (MI), or stroke. Restricted cubic spline (RCS) analysis and Cox regression models were used to analyze the relationship between admission systolic blood pressure (SBP)/diastolic blood pressure (DBP), modeled as continuous and categorical variables, and the primary and secondary outcomes.
RCS analysis revealed a J-shaped association between admission SBP/DBP and the risk of the primary outcome, with significant nonlinearity (both p < 0.001). Both lower and higher SBP/DBP levels were linked to an elevated risk of in-hospital mortality. The SBP/DBP levels associated with the lowest in-hospital mortality risk were 157/94 mmHg. Compared with the reference SBP/DBP group (120–129/70–79 mmHg), lower admission SBP (<110 mmHg) or DBP (60–69 mmHg) significantly elevated the risk of the primary outcome. The adjusted hazard ratio (HR) was 1.08 (95% confidence interval (CI): 1.00–1.17) for SBP of 100–109 mmHg (p = 0.0395) and for SBP <100 mmHg (p = 0.043), and 1.07 (95% CI: 1.01–1.14, p = 0.0305) for DBP of 60–69 mmHg. Similarly, J-shaped curves were noted between SBP/DBP and secondary outcomes, including all-cause mortality, severe bleeding, and MACCEs. However, no significant nonlinear relationship was observed between SBP/DBP and recurrent MI at the 2-year follow-up.
Among STEMI patients undergoing PCI, a J-curve relationship in in-hospital mortality was observed with a nadir at 157/94 mmHg. Similar J-shaped trends were also observed for secondary outcomes including all-cause mortality, severe bleeding and MACCEs. However, no significant nonlinear correlation was found between admission BP and recurrent MI within 2 years.
NCT01874691, https://www.clinicaltrials.gov/study/NCT01874691?term=NCT01874691&rank=1.
The incidence of unstable angina (UA), a type of cardiovascular disease (CVD), has increased in recent years. Meanwhile, timely percutaneous coronary intervention (PCI) or percutaneous transluminal coronary angioplasty (PTCA) procedures are crucial for patients with UA who also have diabetes mellitus (DM). Additionally, exploring other factors that may influence the prognosis of these patients could provide long-term benefits. The systemic immune-inflammation index (SII), a novel marker for assessing inflammation levels, has been shown to correlate with the long-term prognosis of various diseases. Thus, this study aimed to investigate the predictive value of the SII for the long-term prognosis of patients with UA and DM after revascularization.
A total of 937 UA patients who underwent revascularization, of whom 359 also had DM, were included in this study. Patients with DM were divided into two groups: the low SII group (<622.675 × 10⁹/L; n = 219, 61.0%) and the high SII group (≥622.675 × 10⁹/L; n = 140, 39.0%). The primary outcome was the frequency of major adverse cardiovascular and cerebrovascular events (MACCEs). The secondary outcome was the incidence of all-cause death.
Of the 359 patients with DM who visited our institution between January 2018 and January 2020, 23 (10.5%) in the low SII group experienced MACCEs, compared with 34 (24.3%) in the high SII group, a statistically significant difference (p < 0.001). Univariate and multivariate regression analyses of the endpoint events identified several risk factors for MACCEs: high SII levels, a history of myocardial infarction (MI), prior PCI or coronary artery bypass grafting (CABG), elevated brain natriuretic peptide (BNP), and the lack of angiotensin-converting enzyme inhibitor (ACEI) or statin use. After adjustment for covariates including age, sex, body mass index (BMI), BNP, smoking, hypertension, PCI or CABG history, MI history, statin use, ACEI use, and the presence of three-vessel coronary disease, only the SII level remained independently associated with MACCEs (HR: 0.155, 95% CI: 0.063–0.382; p = 0.001). However, high SII levels were not associated with the other individual endpoint events, including non-fatal stroke, cardiovascular death, non-fatal MI, or cardiac rehospitalization.
Elevated SII levels following percutaneous intervention are associated with poor outcomes in patients with UA and DM. Therefore, regular monitoring and controlling inflammation levels may help improve long-term outcomes.
Coagulation disorders are potentially among the most important pathogenetic mechanisms of acute respiratory distress syndrome (ARDS) following acute type A aortic dissection (ATAAD). This study aimed to determine whether aortic dissection itself and cardiopulmonary bypass (CPB) surgery can activate coagulation pathways, promoting ARDS development in patients with ATAAD.
A total of 450 patients who received treatment at Beijing Anzhen Hospital, Capital Medical University, between March 2023 and February 2024 were consecutively enrolled in this prospective cohort study. We analyzed the clinical factors and measured serum coagulation biomarkers by enzyme-linked immunosorbent assay (ELISA) among patients with ATAAD, aortic aneurysm (AA), or unstable angina (UA). Logistic regression, two-way analysis of variance (ANOVA), and Spearman's correlation analysis were performed. Furthermore, the patients with ATAAD were divided into ARDS (based on chest radiographic findings and an oxygenation index ≤300 mmHg) and non-ARDS groups for subgroup comparisons.
The incidence of postoperative ARDS among patients with ATAAD was 20.7% (vs. 13.3% in the AA group and 7.3% in the UA group). Preoperatively, prothrombin time (PT) was longer in patients with ATAAD than in those with AA or UA (12.0 (95% confidence interval (CI): 11.5–12.6) vs. 11.4 (95% CI: 10.9–12.1) vs. 11.2 (95% CI: 10.8–11.6) s, respectively; p < 0.001). D-dimer, fibrin degradation product (FDP), factor XIIa, and factor VIII-Ag (FVIII-Ag) levels were significantly elevated both preoperatively and postoperatively in patients with ATAAD. Within the ATAAD subgroups, FDP levels immediately after surgery were significantly higher in the ARDS group than in the non-ARDS group (odds ratio (OR): 2.26, 95% CI: 1.13–4.54; p = 0.022). In addition, the factor XII (FXII) level at 24 hours after surgery correlated negatively with the oxygenation index (r = –0.682, p = 0.043).
Coagulation activation may be caused by aortic dissection itself and by CPB, promoting postoperative ARDS in patients with ATAAD.
The AMP-activated protein kinase (AMPK) alpha (AMPKα) subunit is the catalytic subunit of the AMPK complex and includes the α1 and α2 isoforms. Phosphorylation at the Thr172 site of the α-subunit by upstream kinases is critical for AMPK activation. The kinases upstream of AMPK include liver kinase B1 (LKB1), calcium/calmodulin-dependent protein kinase kinase β (CaMKKβ), and transforming growth factor β-activated kinase 1 (TAK1). LKB1 predominantly regulates the AMPKα2 isoform, whereas the phosphorylating roles of CaMKKβ and TAK1 toward the different AMPKα isoforms have yet to be properly defined. Moreover, the understanding of the roles of AMPKα1 and α2 remains limited. Significant differences exist between the AMPKα1 and AMPKα2 isoforms regarding tissue distribution, cellular localization, and cardiac-specific roles, with AMPKα2 being the predominant catalytic isoform in the heart. During heart failure (HF), activated AMPKα isoforms, particularly AMPKα2, promote the remodeling of energy metabolism, ameliorate mitochondrial dysfunction, activate mitophagy, attenuate oxidative stress, and reduce cardiomyocyte death, thereby protecting cardiac function and delaying HF progression. Thus, drugs that selectively activate AMPK complexes containing the α2 isoform may help treat HF. However, current AMPK activators are poorly subtype-selective: direct agonists remain in clinical trials, and indirect agonists, although widely used in the clinic, have non-AMPK-dependent effects. Therefore, a compelling need exists to develop subtype-selective activators with greater specificity and efficacy and fewer side effects.
Depression is a highly prevalent mental disorder worldwide and is often accompanied by various somatic symptoms. Clinical studies have suggested a close association between depression and cardiac electrophysiological instability, particularly sudden cardiac death (SCD) and arrhythmias. Therefore, this review systematically evaluated the association between depression and the risks of SCD, atrial fibrillation (AF), and ventricular arrhythmias.
This analysis was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The PubMed, Embase, Web of Science, China National Knowledge Infrastructure, VIP, and Wanfang databases were comprehensively searched for studies examining the association between depression and the risk of SCD and arrhythmias, from database inception until April 10, 2025. Eligible cohort studies meeting quality criteria were included. Pooled effect estimates were computed using a random-effects model. Statistical analyses were performed using Review Manager 5.4 and STATA 16.0.
A total of 20 studies were included in this meta-analysis exploring the relationship between depression and SCD as well as arrhythmias. SCD exhibited a statistically significant association with depression (hazard ratio (HR): 2.52, 95% confidence interval (CI): 1.82–3.49). Ventricular tachycardia (VT)/ventricular fibrillation (VF) was also significantly associated with depression (HR: 1.38, 95% CI: 1.03–1.86). Depression was also considerably more likely to develop following AF. Furthermore, the association with depression was stronger for AF recurrence (HR: 1.89, 95% CI: 1.54–2.33) than for new-onset AF (HR: 1.10, 95% CI: 0.98–1.25).
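The random-effects pooling behind estimates like these (implemented in Review Manager and STATA) can be illustrated with a minimal DerSimonian-Laird sketch on log hazard ratios. The effects and variances below are invented for illustration and are not taken from the included studies; real software adds refinements (Hartung-Knapp adjustment, continuity corrections) omitted here.

```python
from math import log, exp

# Hedged sketch of DerSimonian-Laird random-effects pooling on log-scale effects.
def pool_random_effects(log_effects, variances):
    """Return (pooled log effect, tau^2) for per-study log HRs and variances."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(w) - 1)) / c)              # between-study variance
    w_star = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_effects)) / sum(w_star)
    return pooled, tau2

# Three hypothetical studies reporting HRs for SCD (made-up numbers):
log_hrs = [log(2.1), log(2.8), log(2.4)]
pooled, tau2 = pool_random_effects(log_hrs, [0.04, 0.09, 0.06])
pooled_hr = exp(pooled)  # pooled hazard ratio back on the original scale
```

Pooling on the log scale and exponentiating at the end is standard for ratio measures such as HRs, since log HRs are approximately normally distributed.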
This study highlights a significant association between depression and elevated risks of SCD and arrhythmias, including both AF and VT/VF. These findings underscore the importance of incorporating mental health evaluation into comprehensive cardiovascular risk management strategies.
CRD42024498196, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024498196.
Glucagon-like peptide-1 receptor agonists (GLP-1RAs) are a promising new class of drugs, whose clinical potential has recently been explored. Various preclinical studies and clinical trials initially demonstrated the efficacy of GLP-1RAs in treating type 2 diabetes mellitus (T2DM). However, long-term clinical practice has revealed that GLP-1RAs also exhibit significant efficacy and preventive effects in cardiovascular diseases. These effects are mediated through multiple gene pathways; thus, these drugs have shown substantial potential for further development in different clinical contexts. Cardiomyopathy, which constitutes a significant proportion of cardiovascular-related diseases, is increasingly prevalent, with its incidence rising annually. Thus, following the recent surge in research on cardiomyopathy, this review aims to summarize the latest findings regarding the association between GLP-1RAs and cardiomyopathy. This review begins with an introduction to GLP-1RAs, discussing their specific mechanisms of action. This article then addresses the pathogenesis, progression, and mechanisms of cardiomyopathy. Subsequently, a detailed analysis of the relationship between GLP-1RAs and cardiomyopathy is conducted. Finally, this review summarizes and discusses the latest literature on the impact of GLP-1RAs on the risk of various types of cardiomyopathy, as well as the potential underlying biological mechanisms, to provide clinical guidance on the use of GLP-1RAs in the treatment of cardiomyopathy.
To identify latent trajectories of compliance with dual antiplatelet therapy (DAPT) after percutaneous coronary intervention (PCI) in patients with acute coronary syndrome (ACS) using growth mixture modeling (GMM), and to analyze their predictors, thereby providing evidence for dynamic adherence monitoring and tailored interventions.
A total of 150 patients with ACS after PCI were selected by convenience sampling. Patients were assessed at baseline using the Self-Efficacy for Appropriate Medication Use Scale (SEAMS), the family APGAR index (APGAR), the Generalized Anxiety Disorder-2 (GAD-2), and the Patient Health Questionnaire-2 (PHQ-2). Compliance with DAPT was assessed using the 8-item Morisky Medication Adherence Scale (MMAS-8) at 1, 3, 6, 9, and 12 months after discharge. GMM was used to identify compliance trajectories, and multiple logistic regression was used to analyze the predictors of the different trajectory categories.
Two DAPT compliance trajectory categories were identified in patients with ACS after PCI: a low and declining compliance group (7.41%) and a persistently high compliance group (92.59%). Multivariate logistic regression analysis showed that age ≥60 years, body mass index (BMI), and the family APGAR index were predictors of trajectory category.
Significant population heterogeneity was observed in DAPT compliance trajectories in ACS patients within 12 months after PCI. Compliance remained stably high in most patients, while a small minority started at a low level and declined significantly. Based on the identified predictors, healthcare personnel can recognize patients in the low, declining compliance group early and implement targeted interventions to improve DAPT compliance in ACS patients after PCI.
The Oxidative Balance Score (OBS) is a new measure for assessing systemic oxidative stress, where higher scores indicate increased exposure to antioxidants. However, the relationship between the OBS and mortality in individuals with hypertension remains unclear.
This study evaluated 8151 hypertensive individuals from the National Health and Nutrition Examination Survey (NHANES) (2001–2018), utilizing data from the National Death Index, tracked through December 31, 2019. The association between OBS and mortality (cardiovascular and all-cause) was examined using multivariable Cox regression models.
During a median follow-up of 9.7 years, 1692 deaths occurred (461 cardiovascular). Multivariable Cox regression showed that participants in the highest OBS quartile had significantly lower risks of all-cause mortality (hazard ratio (HR): 0.761, 95% CI: 0.635–0.912) and cardiovascular mortality (HR: 0.553, 95% CI: 0.388–0.788) than those in the lowest quartile. Each one-unit increase in the OBS was associated with a 1.9% reduction in all-cause mortality risk and a 3.7% reduction in cardiovascular mortality risk. This relationship remained consistent across subgroup analyses, and spline regression supported a linear inverse trend.
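As a hedged illustration of how such per-unit estimates scale: under the proportional-hazards assumption, a per-unit hazard ratio compounds multiplicatively with the size of the increment. The reported 1.9% and 3.7% per-unit reductions correspond to per-unit HRs of roughly 0.981 and 0.963, and the sketch below applies them to a hypothetical 10-point OBS difference (the 10-point contrast is illustrative, not from the study).

```python
# Hedged sketch: multiplicative scaling of a per-unit hazard ratio under a
# Cox proportional-hazards model with a log-linear continuous covariate.
def hr_for_increment(per_unit_hr, units):
    """Hazard ratio for a `units`-point increase in the covariate."""
    return per_unit_hr ** units

# Per-unit HRs derived from the reported 1.9% and 3.7% reductions:
hr_all_cause_10 = hr_for_increment(0.981, 10)  # all-cause, 10-point contrast
hr_cardio_10 = hr_for_increment(0.963, 10)     # cardiovascular, 10-point contrast
```

So a 10-point OBS difference would correspond to roughly a 17% lower all-cause hazard and a 31% lower cardiovascular hazard, if the linear inverse trend supported by the spline regression holds across that range.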
For adults with hypertension, an elevated OBS is independently associated with a lower risk of mortality both from all-cause and cardiovascular diseases, suggesting that higher antioxidant levels may be protective.
Identifying the etiology of acute ischemic stroke (AIS) is critical for secondary prevention and treatment choice in stroke patients. This study aimed to investigate the dual-energy computed tomography (DECT) quantitative thrombus parameters associated with cardioembolic (CE) stroke and develop a nomogram that combines DECT and clinical data to identify CE stroke.
We retrospectively reviewed all consecutive patients from January 2020 to July 2022 with anterior circulation stroke and proximal intracranial occlusions. Patients were divided into CE stroke and non-cardioembolic (NCE) stroke groups according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria. Univariable and multivariable logistic analyses were conducted, and a nomogram was developed by combining clinical and DECT variables. This nomogram was subsequently validated using an independent internal cohort of patients.
A total of 96 patients were analyzed, of which 43 (45%) were diagnosed with CE stroke. The multivariable analysis identified the following factors as being independently associated with CE stroke: normalized iodine concentration (NIC) (per 10⁻² unit increase) (odds ratio (OR) = 1.598, 95% CI: 1.277–1.998; p < 0.001), gender (OR = 0.113, 95% CI: 0.028–0.446; p = 0.002), hypertension (OR = 0.204, 95% CI: 0.054–0.770; p = 0.019), and baseline National Institutes of Health Stroke Scale (NIHSS) (OR = 1.168, 95% CI: 1.053–1.296; p = 0.003). The matching nomogram displayed an area under the curve (AUC) of 0.929 in the study sample (n = 96) and 0.899 in the validation cohort (n = 29).
A nomogram combining clinical and DECT variables showed good diagnostic performance for identifying CE stroke.
Intrinsic capacity (IC) is defined as the combination of all physical and mental (including psychosocial) capacities that an individual can rely on at any given time. Previous studies have shown that a decline in IC is linked to an increased mortality rate. Thus, this study aimed to evaluate the impact of IC on the 5-year mortality of older people with cardiovascular disease.
This was a prospective cohort study conducted at a tertiary-level A hospital in China between September 2018 and April 2019, with a follow-up period of 5 years. We applied a proposed IC score to assess the baseline IC of each participant. The primary clinical outcome was 5-year all-cause mortality.
A total of 524 older patients (mean age, 75.2 ± 6.5 years; 51.7% men) were enrolled from the cardiology ward, of whom 86 (16.5%) died during the 5-year follow-up period. Compared with the survival group, patients in the mortality group were older (81.1 ± 5.7 vs. 74.0 ± 6.0 years; p < 0.01), included a higher proportion of men (61.6% vs. 49.8%; p = 0.04), had a lower IC score (7.0 (6.0, 8.0) vs. 8.0 (7.0, 9.0); p < 0.01), and had higher prevalence rates of atrial fibrillation or atrial flutter (34.9% vs. 20.1%; p < 0.01), heart failure (44.2% vs. 11.2%; p < 0.01), diabetes (48.8% vs. 33.1%; p < 0.01), and chronic kidney disease (19.8% vs. 4.3%; p < 0.01). After adjusting for covariates, multivariate Cox regression showed that a higher IC score was associated with a lower hazard of 5-year all-cause mortality (hazard ratio (HR) = 0.79, 95% confidence interval (CI): 0.69–0.92, p < 0.01).
Among older patients with cardiovascular disease, the IC score is independently associated with 5-year all-cause mortality, with a lower IC score indicating a poorer prognosis.
ChiCTR1800017204; date of registration: 07/18/2018. URL: https://www.chictr.org.cn/showproj.html?proj=28931.
The causal relationship between migraines and patent foramen ovale (PFO) remains controversial, and a major unresolved question is how to define migraines attributable to PFO. Thus, this study aimed to determine if brain lesions could be a potential indicator of PFO-related migraines.
Consecutive migraine patients from 2017 to 2019 who underwent transthoracic echocardiography or transcranial Doppler examination with agitated saline contrast injection were assessed for right-to-left shunts. We then examined diffusion-weighted imaging (DWI) findings on brain magnetic resonance imaging and their association with PFO in the included patients.
A total of 424 patients with a mean age of 44.39 ± 12.06 years were included in this retrospective study. Among them, 244 (57.5%) had PFO, and 246 (58%) had subclinical brain lesions, which presented as single or multiple scattered lesions. No association was observed between PFO and brain lesions in the total cohort (odds ratio (OR): 0.499); however, a significant association was observed in patients aged less than 46 years (OR: 3.614, 95% confidence interval (CI): 1.128–11.580 in the group aged <34 years; OR: 3.132, 95% CI: 1.334–7.350 in the group aged 34 to <46 years). DWI lesions in patients with PFO were located more often in the anterior or multiple vascular territories than in the posterior territory (p = 0.033). DWI lesion number, location, and right-to-left shunt volume did not affect the association between DWI-observed lesions and PFO.
This study demonstrated that subclinical brain lesions are associated with PFO and may be used as a potential predictor of PFO-related migraines in patients aged less than 46 years. This may help identify candidate patients for PFO closure in future clinical decisions.
Cardiovascular diseases (CVDs) are the leading cause of mortality worldwide, and coronary artery disease (CAD) is one of the most common forms of CVD. Early and accurate diagnosis is important for improved outcomes in CAD patients. Invasive coronary angiography and coronary computed tomography angiography are accurate diagnostic tools for CAD. However, these examinations have limitations, including invasiveness and the use of ionizing radiation, which restrict their application in certain population groups. Coronary magnetic resonance angiography (CMRA), in contrast, is a noninvasive method that provides high-resolution coronary artery images without ionizing radiation or contrast agents. Nonetheless, the quality of CMRA images depends on numerous physiological and technical factors. This review analyzes the main factors that affect CMRA image quality and provides theoretical and technical insights for better clinical application of CMRA in CAD diagnosis.
To examine the predictive value of the Timed Up and Go test (TUGT) for five-year mortality among older patients with cardiovascular disease (CVD).
This prospective cohort study was conducted at Beijing Hospital in China from September 2018 to April 2019, with a follow-up period of 5 years. Patients underwent the TUGT at baseline and were categorized into two groups based on the results: Group 1 (TUGT >15 s) and Group 2 (TUGT ≤15 s). The primary outcome was all-cause mortality over five years.
The study included 491 older patients from the cardiology ward (average age 74.83 ± 6.38 years; 50.92% male). A total of 69 patients (14.05%) died over the five-year follow-up period. Patients in Group 1 were significantly older (78.36 ± 6.39 vs. 73.47 ± 5.83; p < 0.001) and exhibited higher prevalence rates of heart failure (HF) (21.17% vs. 11.86%; p = 0.009) and stroke or transient ischemic attack (TIA) (24.09% vs. 12.15%; p = 0.001) compared to those in Group 2. After adjusting for covariates, multivariate Cox regression analysis revealed that a TUGT >15 s in CVD patients was significantly associated with an elevated hazard ratio for five-year all-cause mortality (hazard ratio (HR): 2.029; 95% confidence interval (CI): 1.198–3.437; p = 0.004).
The TUGT is independently associated with 5-year all-cause mortality among older patients with CVD, with a TUGT >15 s indicating a poorer prognosis.
ChiCTR1800017204; date of registration: 07/18/2018. URL: https://www.chictr.org.cn/showproj.html?proj=28931.
Despite advances in treatment and the potential role of serum albumin as a prognostic biomarker, the mortality rate of individuals with coronary heart disease (CHD) continues to increase. Thus, this study aimed to assess the relationship between serum albumin levels and the risk of all-cause mortality and cardiovascular death in individuals with CHD.
This large-scale retrospective cohort study included 1556 participants diagnosed with CHD from the National Health and Nutrition Examination Survey spanning 1999 to 2015. We conducted multivariate Cox regression, subgroup and sensitivity analyses, and restricted cubic spline (RCS) plots to examine the link between serum albumin levels and all-cause mortality and cardiovascular death.
After gradual adjustment for confounding variables, higher serum albumin was consistently associated with lower risks of all-cause and cardiovascular mortality when modeled as a continuous variable (hazard ratio (HR): 0.938, 95% confidence interval (CI): 0.912–0.964; p < 0.001 and HR: 0.921, 95% CI: 0.884–0.960; p < 0.001, respectively). When modeled as a three-category variable, with Tertile 1 (T1, ≤40 g/L), Tertile 2 (T2, 40–43 g/L), and Tertile 3 (T3, >43 g/L), serum albumin was significantly related only to the risk of all-cause death (T2 vs. T1, HR: 0.771, 95% CI: 0.633–0.939; p = 0.010; T3 vs. T1, HR: 0.761, 95% CI: 0.612–0.947; p = 0.014). Subgroup analysis showed that serum albumin was linked to all-cause mortality across most groups (age ≤60 or >60 years, male or female, and those without hypertension, diabetes, or chronic kidney disease); however, its correlation with cardiovascular death was observed only in the subgroup without hypertension (p < 0.05). Sensitivity analysis indicated that excluding participants with an estimated glomerular filtration rate <30 mL/min/1.73 m² did not alter these associations. Moreover, the RCS analysis supported a consistent negative linear trend between serum albumin levels and mortality risks (p for nonlinearity >0.05).
The serum albumin levels in individuals with CHD were inversely and linearly related to all-cause mortality and cardiovascular death risk.
Intravascular optical coherence tomography (OCT) is a revolutionary invasive imaging method, offering in vivo high-resolution cross-sectional views of human coronary arteries and thereby driving a significant evolution in the understanding of vascular biology in both acute and chronic coronary pathologies. Since its development in the early 1990s, OCT has provided detailed insights into vascular biology, enabling a more thorough assessment of coronary artery disease (CAD) and the impact of percutaneous coronary intervention (PCI). Moreover, a series of recent clinical trials has consistently demonstrated the clinical benefits of intravascular imaging (IVI) and OCT-guided PCI, showing improved outcomes compared with angiography-guided procedures, particularly in cases of complex coronary pathology. Nonetheless, despite its advantages, OCT has several limitations, including limited penetration depth and the need for additional contrast agent administration, which may constrain its widespread adoption. Economic and logistical challenges also remain, including heterogeneous levels of training among interventional cardiologists, leading to the underutilization of OCT in the Western world. Meanwhile, emerging technologies and the integration of machine learning and artificial intelligence-based algorithms are set to enhance diagnostic accuracy in daily practice. Future research is necessary to address existing limitations and investigate next-generation devices, further advancing the field of interventional cardiology toward optimal imaging-guided PCI and improved outcomes.
Coronary microvascular disease has been found to increase the incidence of the composite endpoint of cardiovascular events and to affect coronary revascularization. Coronary microvascular disease often accompanies epicardial disease, and despite successful revascularization and optimal medication, it may lead to reduced exercise tolerance and worsening clinical symptoms. Moreover, despite advances in percutaneous coronary intervention for coronary revascularization, the management of microvascular obstruction in reperfused myocardial tissue remains challenging and high-risk. Previous studies have identified the coronary venous system as a new avenue for treating coronary microvascular obstruction associated with revascularization. Current data suggest that coronary sinus interventions, primarily coronary sinus reducer implantation and pressure-controlled intermittent coronary sinus occlusion, can, by modulating coronary venous pressure, provide significant clinical benefit in 70–80% of patients with refractory angina pectoris or acute myocardial infarction who suffer from microvascular disease with no possibility of revascularization. However, a recent randomized trial demonstrated no difference in infarct size reduction between pressure-controlled intermittent coronary sinus occlusion-assisted and conventional primary percutaneous coronary intervention. This article reviews recent advancements in coronary sinus-based therapeutic approaches for coronary microvascular disease.
This study aimed to investigate the performance of two versions of ChatGPT (o1 and 4o) in making decisions about coronary revascularization and to compare their recommendations with those of a multidisciplinary Heart Team. Moreover, the study aimed to assess whether the decisions generated by ChatGPT, based on the system's internal knowledge base and clinical guidelines, align with expert recommendations in real-world coronary artery disease management. Given the increasing availability and processing capabilities of large language models such as ChatGPT, this comparison offers insights into the potential applicability of these systems in complex clinical decision-making.
We conducted a retrospective study at a single center, which included 128 patients who underwent coronary angiography between August and September 2024. The demographics, medical history, current medications, echocardiographic findings, and angiographic findings for each patient were provided to the two ChatGPT versions. The two models were then asked to choose one of three treatment options: coronary artery bypass grafting (CABG), percutaneous coronary intervention (PCI), or medical therapy, and to justify their choice. Performance was assessed using metrics such as accuracy, sensitivity, specificity, precision, F1 score, Cohen's kappa, and Shannon's entropy.
The Heart Team recommended CABG for 78.1% of the patients, PCI for 12.5%, and medical therapy for 9.4%. ChatGPT o1 demonstrated higher sensitivity in identifying patients who needed CABG (82%) but lower sensitivity for PCI (43.7%), whereas ChatGPT 4o performed better in recognizing PCI candidates (68.7%) but was less accurate for CABG cases (43%). Both models struggled to identify patients suitable for medical therapy, with no correct predictions in this category. Agreement with the Heart Team was low (Cohen's kappa: 0.17 for o1 and 0.03 for 4o). Notably, these errors were often attributed to the models' limited understanding of clinical context and their inability to analyze angiographic images directly.
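As a brief illustration of how agreement metrics of this kind are computed, the following Python sketch derives per-class sensitivity, Cohen's kappa, and Shannon's entropy from a three-class confusion matrix. The counts below are invented for illustration only and are not the study's actual data.

```python
import math

# Hypothetical 3-class confusion matrix (rows: Heart Team, cols: model).
# The counts are illustrative assumptions, not the study's results.
labels = ["CABG", "PCI", "MT"]
cm = [
    [82, 14, 4],   # Heart Team recommended CABG
    [7, 7, 2],     # Heart Team recommended PCI
    [6, 4, 2],     # Heart Team recommended medical therapy
]

n = sum(sum(row) for row in cm)

# Per-class sensitivity (recall): correct predictions / row total
sensitivity = {labels[i]: cm[i][i] / sum(cm[i]) for i in range(3)}

# Cohen's kappa: agreement beyond chance
po = sum(cm[i][i] for i in range(3)) / n                      # observed agreement
row_totals = [sum(cm[i]) for i in range(3)]
col_totals = [sum(cm[i][j] for i in range(3)) for j in range(3)]
pe = sum(row_totals[i] * col_totals[i] for i in range(3)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)

# Shannon entropy (bits) of the model's prediction distribution
probs = [c / n for c in col_totals]
entropy = -sum(p * math.log2(p) for p in probs if p > 0)

print(sensitivity["CABG"], round(kappa, 3), round(entropy, 3))
```

A kappa near zero indicates agreement barely above chance, mirroring the low values (0.17 and 0.03) reported above; the entropy term quantifies how concentrated the model's recommendations are across the three options.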
While ChatGPT-based artificial intelligence (AI) models show promise in assisting with cardiac care decisions, the current limitations of these models emphasize the need for further development. Incorporating imaging data and enhancing comprehension of clinical context is essential to improve the reliability of these AI models in real-world medical settings.
Coronary heart disease (CHD) is associated with increased morbidity and mortality. Acute cardiovascular events frequently occur in patients with coronary artery stenoses exceeding 70%. Although coronary revascularization can significantly improve ischemic symptoms, the inflection point for reducing mortality from CHD has yet to be reached. Therefore, the prevention and treatment of mild-to-moderate coronary artery stenosis should be given significant attention to more effectively reduce the incidence and mortality of acute events from CHD. Accordingly, a stenosis of less than 70% is used here to define mild-to-moderate coronary artery stenosis. Although soft plaques and plaque rupture may not have a significant impact on hemodynamics, the acute cardiovascular events they cause are detrimental and result in increased mortality. This review summarizes the methods available for detecting mild-to-moderate coronary artery stenoses, assessing risk, and understanding the mechanisms underlying adverse events. Moreover, this review proposes intervention strategies for preventing and treating mild-to-moderate coronary stenosis.
Valvular heart disease (VHD), including both non-rheumatic valvular heart disease (NRVHD) and rheumatic valvular heart disease (RVHD), is a major global health concern. Moreover, the progression of VHD to heart failure (HF) poses substantial clinical and public health challenges. In light of global population aging, increasing cardiovascular risk factors, and the additional strain imposed by the COVID-19 pandemic, a timely reassessment of the VHD-related HF burden is urgently needed. Using the most recent data from the Global Burden of Disease (GBD) Study 2021, this study aimed to evaluate the distribution of the VHD-related HF burden in 2021, examine long-term trends from 1990 to 2021 and short-term changes between 2019 and 2021, and provide updated insights to inform future prevention and management strategies.
Using GBD 2021 data, we analyzed the distribution of the VHD-related HF burden, based on age-standardized prevalence rates, across the Group of Twenty (G20) countries.
The highest NRVHD-related HF burden in 2021 was observed in the United States (US), Italy, and Russia, while the highest RVHD-related HF burden was noted in India, France, and China. Over the past 30 years (1990–2021), the NRVHD-related HF burden decreased in developed countries (e.g., the US, Canada, Japan) but increased in emerging economies (e.g., India, Brazil, South Africa), with significant increases also observed in Argentina and Mexico, among other countries. Notably, nearly all G20 countries exhibited a downward trend in RVHD-related HF burden, with Germany and Australia being the exceptions. During the COVID-19 pandemic (2019–2021), the NRVHD-related HF burden declined in most G20 nations, except for South Africa, India, and a few others, while the RVHD-related HF burden increased slightly in countries such as Mexico, Russia, and Indonesia.
Trends in NRVHD- and RVHD-related HF burden across G20 countries exhibited notable variations, and these became more pronounced under the impact of the COVID-19 pandemic. These findings underscore the importance of developing long-term strategies to enhance the resilience of healthcare systems, improve chronic disease management, and optimize resource allocation to promote cardiovascular health and preparedness for public health challenges.
Postoperative atrial fibrillation (POAF) commonly occurs following surgical repair of degenerative mitral regurgitation (DMR) and is associated with unfavorable outcomes. This study aimed to identify preoperative risk factors for acute POAF in patients undergoing mitral valve repair for DMR, with a specific focus on the role of preoperative echocardiography.
A retrospective study was conducted involving 1127 DMR patients who underwent mitral valve repair between 2017 and 2022. The primary endpoint was the occurrence of acute POAF within 30 days after surgery. Univariate and multivariate logistic regression analyses were performed to identify risk factors for POAF. Additionally, subgroup analyses were conducted to evaluate the predictive value of preoperative parameters for the development of acute POAF.
Acute POAF was observed in 152 patients (13.5%). After adjusting for covariates, multivariate analysis revealed that age (odds ratio (OR) 1.05; 95% confidence interval (CI) 1.03–1.07, p < 0.001), hypertension (OR 1.50; 95% CI 1.03–2.21, p = 0.037), left ventricular ejection fraction (OR 0.95; 95% CI 0.92–0.98, p = 0.004), and left atrial enlargement (OR 1.03; 95% CI 1.00–1.06, p = 0.019) were independent predictors of acute POAF. The interventricular septum (IVS) thickness demonstrated a strong association with acute POAF (OR 1.21; 95% CI 1.06–1.38, p = 0.005). The optimal cut-off value for the IVS thickness in predicting acute POAF was 11.0 mm. The adjusted OR for the association between an IVS thickness >11 mm and acute POAF was 1.73 (95% CI 1.03–2.89, p = 0.037). The IVS thickness was consistently identified as a significant predictor of POAF in the subgroup analyses.
Preoperative assessment of clinical morbidity and echocardiographic parameters, particularly IVS thickness, can be valuable in identifying high-risk patients for acute POAF and informing targeted strategies for prevention and management.
Left ventricular noncompaction (LVNC), also called noncompaction cardiomyopathy (NCM), is a myocardial disease that affects children and adults. Morphological features of LVNC include a noncompacted spongiform myocardium due to the presence of excessive trabeculations and deep recesses between prominent trabeculae. Incidence and prevalence rates of this disease remain contentious due to varying clinical phenotypes, ranging from an asymptomatic phenotype to fulminant heart failure, cardiac dysrhythmias, and sudden death. There is a strong genetic component associated with LVNC, and nearly half of pediatric LVNC patients harbor an identifiable genetic mutation. Recent studies have identified LVNC-associated mutations in genes involved in intercellular trafficking and cytoskeletal integrity, in addition to well-known mutations causing abnormal cardiac embryogenesis. Currently, the diagnosis is based on symptoms, as well as various diagnostic criteria, including echocardiography, electrocardiograms, and cardiac magnetic resonance imaging. Meanwhile, clinical management is primarily focused on the prevention of complications, such as heart failure, thromboembolic events, life-threatening arrhythmias, and stroke. Continued research is focusing on the genetic etiology, the development of gold-standard diagnostic criteria, and evidence-based treatment guidelines across all age groups. This review article will highlight the genotype–phenotype relationship within pediatric LVNC patients and assess the latest discoveries in genetic and molecular research aimed at improving their diagnostic and therapeutic management.
Despite continued advancements in transcatheter aortic valve implantation (TAVI) techniques, the incidence of permanent pacemaker implantation (PPI) remains substantial. Established predictors of PPI include advanced age, pre-existing electrocardiographic conduction abnormalities, prosthetic valve type, implantation depth, and anatomical parameters, such as membranous septum length, which are currently under active investigation. In routine clinical practice, the management strategy often involves the temporary placement of a transvenous pacemaker lead, followed by a period of observation. While widely implemented, this approach introduces clinical uncertainty and may contribute to prolonged hospitalization, particularly given the not infrequent occurrence of delayed high-degree atrioventricular (AV) block. A novel diagnostic method emerging from electrophysiological evaluation is rapid atrial pacing performed post-TAVI, which aims to assess susceptibility to Wenckebach-type AV block. Two observational studies have evaluated this technique, utilizing an upper pacing threshold of 120 beats per minute as a cutoff to identify patients at risk of requiring permanent pacing. Moreover, this method is cost-effective, technically straightforward, and time-efficient; preliminary findings suggest this technique possesses a high negative predictive value. However, additional prospective data are required to validate the clinical utility of this technique and inform the development of standardized implementation. An upcoming clinical study (NCT06189976) is anticipated to provide valuable insights.
Myocarditis is a life-threatening inflammatory disorder that affects the cardiac muscle tissue. Current treatments merely regulate heart function but fail to tackle the root cause of inflammation. In myocarditis, the initial wave of inflammation is characterized by the presence of neutrophils. Subsequently, neutrophils secrete chemokines and cytokines at the site of heart tissue damage to recruit additional immune cells and regulate defense responses, thereby exacerbating myocarditis. Recent discoveries have shown that neutrophil extracellular traps (NETs) and their components not only reinforce the proinflammatory functions of neutrophils, inducing enhanced interleukin (IL)-8 secretion, but also induce monocyte/macrophage activation, differentiation, and phagocytic function through the inflammasome pathway. The inflammasome cascade triggers a positive feedback loop through the secretion of proinflammatory cytokines, which leads to further neutrophil activation and degranulation, NET release, monocyte and macrophage infiltration, tissue degradation, and myocardial damage, indicating that neutrophils promote myocarditis-induced cardiac necrosis and an anti-cardiac immune response. In addition, neutrophils can induce oxidative stress and damage cellular structures by releasing excess reactive oxygen species (ROS), thus exacerbating tissue damage in myocarditis. Meanwhile, the recruitment of cells, which is facilitated by neutrophil-secreted chemokines, and the consumption of cells through neutrophil phagocytosis can form a closed loop that continuously maintains a proinflammatory state.
This review summarizes the role of neutrophil secretion, phagocytosis and their relationship in myocarditis, and discusses the function of certain agents, such as chemokine antagonists, midkine blockers and neutrophil peptidyl arginine deiminase 4 (PAD4) inhibitors in inhibiting neutrophil secretion and phagocytosis, to provide perspective for myocarditis treatments through the inhibition of neutrophil secretion and phagocytosis.
Blood culture-negative infective endocarditis (BCNE) constitutes an important subtype of infective endocarditis. Despite its rarity, BCNE poses a significant diagnostic challenge and carries a high mortality rate. Recent advances in diagnostic modalities have facilitated the rapid identification of BCNE. Moreover, empiric diagnostic and therapeutic approaches, supported by intensive and rigorous epidemiological and observational investigations, have yielded positive results. There is a growing inclination in clinical management toward early surgical intervention while rigorously assessing surgical risks, complications, and anticipated benefits. This review examines the epidemiology, microbiological data, diagnosis, and medical and surgical management of BCNE in contemporary practice.
This study aimed to determine the optimal dosages of prostaglandin E1 required to maintain a patent ductus arteriosus (PDA) in infants with transposition of the great arteries (TGA) based on point-of-care ultrasound (POCUS) findings.
Infants with TGA were recruited into two groups: a historical control group and a POCUS group, in which POCUS was used in combination with pulse oximetry saturation (SpO2) to titrate the dose of prostaglandin E1 (PGE1).
A total of 150 patients were included in this study. The mean gestational ages were 38.6 weeks and 38.9 weeks, and the mean birth weights were 3.09 kg and 3.23 kg, in the control and POCUS groups, respectively. The rate of PGE1 prescriptions in the control group (93.3%) was higher than in the POCUS group (71.1%; p < 0.001). Among prenatally diagnosed patients, PGE1 was initiated earlier in the POCUS group than in the control group (0.05 ± 0.01 vs. 1.66 ± 3.72 d; p < 0.001). The proportion of patients using a low dose (less than 5 ng/kg⋅min) of PGE1 was higher in the POCUS group (40.6% vs. 8.9%; p < 0.001). The multivariate logistic regression analysis indicated that implementing POCUS significantly reduced the dosage of PGE1.
POCUS can optimize the use of PGE1, reduce unnecessary usage, postpone the initiation of PGE1, minimize the maintenance dose, and reduce the impact dose. POCUS guidance enhances the safety and effectiveness of PGE1 in infants with TGA.
Cardiovascular assessments in children and adolescents with hypertension are essential for detecting early signs of organ damage and guiding timely interventions. The pathophysiology of pediatric hypertension involves a complex interplay of arterial stiffness, endothelial dysfunction, metabolic disturbances, activation of the renin–angiotensin–aldosterone system, and immune dysregulation. These mechanisms collectively contribute to target organ damage, particularly in the cardiovascular system. Traditional office-based blood pressure measurements often fail to identify individuals at high risk, prompting the adoption of more advanced diagnostic techniques. Measures of arterial stiffness, such as pulse wave velocity, augmentation index, and cardio–ankle vascular index, provide valuable insights into vascular health and have been strongly associated with left ventricular hypertrophy and impaired heart function. Imaging modalities, including carotid intima-media thickness and epicardial adipose tissue measurements, serve as indicators of subclinical atherosclerosis and cardiovascular risk. Advanced echocardiographic tools that assess myocardial strain and ventricular–arterial coupling provide a more nuanced understanding of cardiac performance in hypertensive adolescents. These advanced techniques enhance the early detection of cardiovascular abnormalities and support a more individualized approach to managing pediatric hypertension. However, challenges related to validation, standardization, and clinical integration remain. Thus, expanding access to these modalities and refining their use in pediatric populations are crucial steps toward improving long-term cardiovascular outcomes in youth with elevated blood pressure.
Epicardial adipose tissue (EAT) is an indicator of high cardiovascular and metabolic risk. This study aimed to investigate the association between EAT thickness (EATT) and liver fibrosis and steatosis in patients with type 2 diabetes mellitus (T2DM) and metabolic dysfunction-associated steatotic liver disease (MASLD).
Patients with T2DM and MASLD underwent a complex evaluation, which included clinical, laboratory, and liver and transthoracic cardiac ultrasound assessments. The EATT was measured using the standard method. Liver fibrosis and steatosis were evaluated by several non-invasive indexes, through which patients with severe steatosis and advanced fibrosis were identified. Correlations between the EATT and markers of liver fibrosis and steatosis were evaluated by bivariate and multiple regression analyses.
In this study population of 267 T2DM patients with MASLD, the median EATT value was 7 mm, and 43.8% of patients had an EATT >7 mm. The EATT was higher in patients with advanced liver fibrosis (8.97 ± 2.88 mm vs. 7.09 ± 2.38 mm; p < 0.0001) and in those with more severe hepatic steatosis (7.69 ± 2.70 mm vs. 6.61 ± 1.88 mm; p = 0.0310). A higher percentage of patients with advanced liver fibrosis had an EATT of >7 mm (68.3% vs. 36.7%; odds ratio (OR) = 3.72 [95% confidence interval (CI): 2.02; 6.87]; p < 0.0001). In the bivariate analyses, the EATT significantly correlated with the markers of body adiposity, non-invasive indexes of liver steatosis and fibrosis, aspartate aminotransferase (ASAT), gamma glutamyl transpeptidase (GGT), diabetes duration, and pO2. The multiple regression analyses indicated that the EATT was independently associated with fibrosis-4 (FIB-4) score and body fat mass, and with serum ferritin (in fully adjusted models), while the correlation with the markers of hepatic steatosis became non-significant after adjustments for body adiposity.
T2DM patients with MASLD and markers of advanced liver fibrosis had a higher EATT, which was independently associated with liver fibrosis.
Compared to patients with controlled hypertension, those with resistant hypertension (RH) have a higher incidence of cardiovascular complications, including stroke, left ventricular hypertrophy, and congestive heart failure. Therefore, an urgent need exists for improved management and control, along with more effective medications. Aldosterone synthase inhibitors (ASIs) are newly emerging drugs that have gradually attracted increasing attention.
The Cochrane Library, PubMed, Embase, and ClinicalTrials.gov databases were systematically searched to identify all literature on ASIs and resistant hypertension. Additionally, the reference lists of the included articles were manually searched. The quality of the identified studies was assessed using the Cochrane Bias Risk Tool.
This study comprised four randomized controlled trials (RCTs), involving 776 participants. Different doses of ASIs were used, with treatment durations ranging from 7 to 12 weeks. The selected study population included individuals with resistant hypertension and healthy adults. Systolic blood pressure (SBP) had a pooled effect size of standardized mean difference (SMD) = –0.24, with a 95% confidence interval (CI) of [–0.46, –0.03], indicating a statistically significant difference (p = 0.026); however, diastolic blood pressure (DBP) had a pooled effect size of SMD = –0.13, with a 95% CI of [–0.40, 0.15], indicating no significant difference (p = 0.359). Similarly, subgroup analyses yielded comparable results. Notably, the risk of adverse events in the ASI group was greater than that in the control group, with a risk ratio of 1.32 and a 95% CI of [1.04, 1.66], indicating a significant difference (p = 0.02). There was no statistically significant difference in severe adverse events between the treatment group and the control group (p = 0.532).
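For readers unfamiliar with how a pooled effect size such as the SMDs reported above is obtained, the sketch below applies standard inverse-variance (fixed-effect) pooling to four hypothetical trials. The per-trial SMDs and standard errors are invented for illustration and are not the review's actual data.

```python
import math

# Hypothetical per-trial standardized mean differences (SMDs) and their
# standard errors -- illustrative assumptions, not the review's data.
smds = [-0.30, -0.15, -0.28, -0.20]
ses = [0.18, 0.15, 0.20, 0.16]

# Inverse-variance weights: more precise trials contribute more
weights = [1 / se**2 for se in ses]

# Pooled SMD and its standard error under a fixed-effect model
pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))

# 95% confidence interval (normal approximation)
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

print(round(pooled, 3), tuple(round(x, 3) for x in ci))
```

A pooled result is considered statistically significant at p < 0.05 when its 95% CI excludes zero, which is why the SBP interval above ([–0.46, –0.03]) is significant while the DBP interval ([–0.40, 0.15]) is not.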
ASIs have shown benefits in controlling SBP in patients with resistant hypertension, although their effects on DBP appear to be limited. Given the observation period of only 12 weeks, the potential for increased adverse event risks with their use warrants further attention. Considering the relatively small number of trials included and the limited sample size in this study, future research should focus on expanding the sample size and extending the follow-up duration to more precisely define the clinical role and value of ASIs. Additionally, further investigation into the underlying mechanisms of action of these inhibitors is necessary to provide theoretical support for optimizing treatment strategies for resistant hypertension and related conditions.
Stress cardiomyopathy/Takotsubo syndrome (TTS) is a transient cardiac condition characterized by sudden and reversible left ventricular dysfunction, typically triggered by emotional or physical stress. The international TTS (InterTAK) score predicts the probability of suffering from TTS. However, the diagnostic algorithm includes three mutually exclusive diagnoses: acute coronary syndrome (ACS), TTS, and acute infectious myocarditis. Thus, we propose including conditions in which TTS is associated with ACS or myocarditis. While TTS is commonly associated with non-ischemic stressors, recent evidence has indicated that TTS can be found in patients with ACS. Indeed, in some cases, ACS may trigger rather than exclude TTS. Additionally, TTS could also prompt plaque ruptures in coronary arteries. Meanwhile, infections and conditions that cause myocarditis can also produce physical stress that may trigger TTS. Furthermore, TTS has been reported after confirmed viral myocarditis. This opinion article explores the intricate relationships between (i) TTS and ACS, and (ii) TTS and myocarditis, delving into the related pathophysiologies and diagnostic challenges. However, further research is required to elucidate the mechanisms that link TTS with these conditions.
Differences between female and male patients may influence the outcomes of transcatheter aortic valve replacement (TAVR). However, knowledge regarding sex differences in TAVR procedures among Chinese patients remains limited. Therefore, this study aimed to investigate the impact of sex-related differences on reverse left ventricular (LV) remodeling following TAVR in the Chinese population.
Patients with severe symptomatic aortic stenosis (AS) who underwent TAVR at the Heart Center of the Affiliated Zhongshan Hospital of Dalian University were enrolled. A total of 136 patients who underwent implantation of a self-expandable Venus A valve between 2019 and 2024 were evaluated. We retrospectively compared the clinical outcomes and characteristics of all patients by sex.
In our study, females presented with a smaller body surface area (BSA) (1.68 ± 0.15 m2 vs. 1.90 ± 0.14 m2, p < 0.001), aortic valve area (AVA) (0.64 ± 0.22 cm2 vs. 0.77 ± 0.20 cm2, p = 0.003), left ventricular end-diastolic diameter (LVEDD) (49.72 ± 7.37 mm vs. 53.33 ± 8.36 mm, p = 0.023), as well as interventricular septum in diastole (IVSD) (12.85 ± 2.19 mm vs. 13.88 ± 2.61 mm, p = 0.034) at baseline. Comparatively, males had larger aortic root structures at baseline and a larger size of valve implantation during the procedure (p < 0.05). However, the indexed AVA was not significantly different between the two groups at baseline. Sex-specific outcomes, particularly AVA, LVEDD, aortic root diameter (AO), and IVSD, were significantly different during each follow-up within the first six months (p < 0.05), indicating that females experienced greater improvements in these echocardiographic characteristics after TAVR. Left ventricular ejection fraction (LVEF) only improved significantly at 1-month follow-up in females compared to males (57.77 ± 7.87% vs. 54.40 ± 8.21%, p = 0.037). Multivariable linear regression analysis showed that being a female patient (Beta: 10.200; 95% CI: 0.075–20.326; p = 0.048), as well as having a higher IVSD (Beta: 2.939; 95% CI: 1.110–4.769; p = 0.002), and higher baseline left ventricular mass index (LVMi) (Beta: 0.409; 95% CI: 0.298–0.521; p < 0.001) were independently associated with greater mid-term LVMi regression post-TAVR.
Female patients with AS exhibited more favorable mid-term LV reverse remodeling post-TAVR compared to male patients in a Chinese population.
Coronary artery aneurysms (CAAs) are relatively frequent entities, encountered in up to 8% of patients undergoing coronary imaging. The most frequent cause of CAAs is atherosclerotic “positive remodeling” of coronary arteries, although congenital, inflammatory, and traumatic etiologies are also seen. Aneurysms serve as foci for thrombus formation, which may occlude the aneurysmatic segment or embolize distally. Rupture of an aneurysm is a rare yet potentially catastrophic complication of a CAA. Most aneurysms can be managed medically, while percutaneous exclusion of an aneurysm from coronary circulation is appropriate for CAAs that are prone to rupture or thrombosis. Surgical correction remains the ultimate option for patients who are not amenable to percutaneous management or those with a compelling indication for surgery. This review summarizes the available knowledge on the nomenclature, classification, pathophysiology, diagnosis, and management of CAAs, with a particular emphasis on treatment strategies to mitigate the risks associated with CAAs.
Presently, the availability of single-stage surgical correction of mitral valve disease combined with atrial fibrillation (AF) via a mini-access approach remains limited. Moreover, the comparative effectiveness of this procedure versus conventional sternotomy (CS) remains poorly understood. Thus, this study aimed to conduct a comparative assessment of the efficacy and safety of concomitant mitral valve surgery and AF ablation via a minimally invasive approach (minimally invasive cardiac surgery, MICS group) versus the standard sternotomy approach (CS group).
An extensive literature search was performed to identify relevant studies. For comparative analysis, we included only studies in which the combined intervention was conducted exclusively via either a minimally invasive approach or CS as the primary access.
Freedom from atrial arrhythmia (AA) for MICS and CS was 94.52% [95% CI 91.52, 96.50] vs. 80.76% [95% CI 67.19, 89.59] and 86.22% [95% CI 80.13, 90.66] vs. 86.33% [95% CI 79.39, 91.19] at 1 and 2 years, respectively, with no statistically significant differences. Meanwhile, cardiopulmonary bypass (CPB) and aortic cross-clamp (ACC) times were significantly longer in the MICS group compared to CS (CPB: 151.50 vs. 120.01 min; ACC: 112.36 vs. 101.43 min; p < 0.001). There were no differences in mortality between groups (p = 0.709). The rate of pacemaker implantation was significantly higher in the CS group (MICS: 3.32% [95% CI 1.58, 6.87] vs. CS: 5.20% [95% CI 2.80, 9.46]; p < 0.001).
This meta-analysis found that the minimally invasive approach was associated with longer CPB and ACC times but a lower rate of pacemaker implantation, with no significant differences observed in mortality and freedom from AA at 1 and 2 years.
CRD42024570022, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024570022.
Heart failure with reduced ejection fraction (HFrEF) is a progressive condition that is associated with high rates of morbidity, frequent hospitalizations, and significant mortality. Despite advancements in guideline-directed medical therapy (GDMT), many patients continue to be at risk for worsening heart failure (WHF). Vericiguat is a novel soluble guanylate cyclase (sGC) stimulator that targets the impaired nitric oxide (NO)–sGC–cyclic guanosine monophosphate (cGMP) pathway. Thus, by improving vascular and myocardial function, vericiguat offers a promising therapeutic option for patients with HFrEF who remain symptomatic despite receiving optimal medical treatment. This review explores the pathophysiological rationale, mechanism of action, and clinical evidence supporting the use of vericiguat. We analyze data from key randomized controlled trials (RCTs), such as SOCRATES-REDUCED and VICTORIA, as well as meta-analyses, to assess the efficacy and safety of using vericiguat in HFrEF. Additionally, we review real-world studies to evaluate the applicability of vericiguat in clinical practice.
Medical devices for tricuspid regurgitation have emerged as viable treatment options for patients who do not respond to drug therapy or who are unsuitable for open-heart surgery due to high surgical risk. Recently, numerous new medical devices have been proposed and approved for use. Therefore, comprehensive reviews of the literature on the current medical devices for tricuspid regurgitation are necessary. Accordingly, this paper describes all medical devices used for transcatheter tricuspid valve interventions, providing an updated overview of the current options for managing tricuspid regurgitation, a common valvular heart disease associated with changes in the configuration and function of the tricuspid valve. Over 70 million people worldwide suffer from tricuspid regurgitation, with an estimated mortality rate of 0.51 deaths per 10,000 person-years. However, delays in diagnosis and treatment frequently contribute to disease progression. Meanwhile, the growing health and economic burden of tricuspid regurgitation has led to the urgent need for new therapeutic strategies to overcome the limitations of pharmacological and surgical approaches. In this scenario, transcatheter tricuspid valve interventions represent a promising option for patients with severe tricuspid regurgitation, considered inoperable due to excessive surgical risk. Medical devices designed for these innovative approaches are classified into two main groups: transcatheter tricuspid valve repair and replacement systems. This review presents the technological characteristics of medical devices and the results of studies on their clinical efficacy and safety, thereby supporting the use of transcatheter tricuspid valve repair/replacement systems in clinical practice.
Harlequin syndrome, also known as differential hypoxia (DH) or North-South syndrome, is a serious complication of femoro-femoral venoarterial extracorporeal membrane oxygenation (V-A ECMO). The syndrome is caused by competing flows between the retrograde oxygenated ECMO output and the anterograde ejection of poorly oxygenated blood from the native heart. In the setting of impaired pulmonary gas exchange, the addition of an Impella device (ECPELLA configuration), although beneficial for ventricular unloading and hemodynamic support, may further exacerbate this competition and precipitate DH. This narrative review synthesizes current evidence on the pathophysiology, diagnostic strategies, and management of DH in patients supported with V-A ECMO or with ECPELLA. The timely detection of Harlequin syndrome is essential to prevent cerebral and myocardial hypoxia. Current diagnostic approaches include right radial arterial pressure monitoring, multisite arterial blood gas analysis, cerebral oximetry, and echocardiographic evaluation of flow dynamics. Interestingly, emerging tools such as contrast-enhanced ultrasound (CEUS) and suprasternal transthoracic echocardiography (TTE) show promise for non-invasive bedside identification of flow competition. Management of DH requires tailored strategies aimed at restoring adequate oxygen delivery while preserving sufficient ventricular ejection or Impella support. Moreover, circuit reconfiguration remains a key rescue option when conventional optimization fails. This review highlights that successful treatment depends on integrating real-time physiological data with a dynamic understanding of circulatory support, emphasizing the need for multidisciplinary expertise in managing this complex syndrome.
Adult congenital heart disease (ACHD) constitutes a heterogeneous and expanding patient cohort with distinctive diagnostic and management challenges. Conventional detection methods fail to capture lesion heterogeneity and the variability in risk profiles. Artificial intelligence (AI), including machine learning (ML) and deep learning (DL) models, has transformed the potential for improving diagnosis, risk stratification, and personalized care across the ACHD spectrum. This narrative review discusses the current and future applications of AI in ACHD, including imaging interpretation, electrocardiographic analysis, risk stratification, procedural planning, and long-term care management. AI has demonstrated high accuracy in detecting congenital anomalies across imaging modalities, automating measurements, and improving diagnostic consistency. In electrocardiography, AI has been used to detect previously missed defects and to estimate arrhythmia risk. Risk-prediction models based on clinical and imaging data can estimate outcomes such as stroke, heart failure, and sudden cardiac death, thereby informing personalized therapy choices. AI also contributes to surgical and interventional planning through three-dimensional (3D) modelling and image fusion, while AI-powered remote monitoring tools enable the detection of early signs of clinical deterioration. Although these advances are encouraging, limitations in data availability, algorithmic bias, a lack of prospective validation, and integration issues remain to be addressed, and ethical considerations of transparency, privacy, and responsibility must also be highlighted. Future initiatives should therefore prioritize data sharing, explainability, and clinician training to facilitate the secure and effective use of AI. Appropriately integrated, AI can enhance decision-making, improve efficiency, and deliver individualized, high-quality care to ACHD patients.
The hemoglobin, albumin, lymphocyte, and platelet (HALP) score is a meaningful prognostic marker in many cardiovascular diseases. However, its predictive utility for outcomes in patients admitted to the intensive care unit (ICU) for acute myocardial infarction (AMI) has yet to be fully elucidated.
Data from the Medical Information Mart for Intensive Care (MIMIC)-IV v3.1 database were used to analyze the association between the HALP score and 90-day and 365-day all-cause mortality in critically ill patients with AMI. Patients were grouped according to HALP quartiles. Cox proportional hazards regression and restricted cubic spline (RCS) analyses were performed to assess the association between the HALP score and mortality risk. A recursive algorithm identified the HALP inflection point, which defined the high and low HALP groups for Kaplan–Meier survival analysis. Subgroup analyses assessed the robustness of the association across clinical strata. Furthermore, machine learning prediction models incorporating the HALP score were constructed to estimate 90-day mortality, and their performance was evaluated using the area under the receiver operating characteristic curve (AUC).
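For readers unfamiliar with the index, the HALP score is conventionally computed as hemoglobin (g/L) × albumin (g/L) × lymphocyte count (10⁹/L) / platelet count (10⁹/L); the quartile grouping described above can then be derived from the cohort's score distribution. The sketch below illustrates this with hypothetical laboratory values; the function names and example numbers are ours, not taken from the study.

```python
# Illustrative sketch of HALP score computation and quartile grouping.
# Conventional formula (not restated in the abstract):
#   HALP = hemoglobin (g/L) * albumin (g/L) * lymphocytes (10^9/L)
#          / platelets (10^9/L)
from statistics import quantiles

def halp_score(hemoglobin_g_l, albumin_g_l, lymphocytes, platelets):
    """HALP score from routine laboratory values."""
    return hemoglobin_g_l * albumin_g_l * lymphocytes / platelets

# Hypothetical patient values: (hemoglobin, albumin, lymphocytes, platelets)
scores = [halp_score(*p) for p in [
    (120, 35, 1.2, 250),
    (140, 40, 1.8, 200),
    (100, 30, 0.9, 320),
    (135, 38, 1.5, 180),
]]

# Quartile cut points (Q1-Q3) estimated from the cohort's score distribution
q1, q2, q3 = quantiles(scores, n=4)

def quartile(score):
    """HALP quartile (1 = lowest, 4 = highest) for a given score."""
    if score < q1:
        return 1
    if score < q2:
        return 2
    if score < q3:
        return 3
    return 4
```

In the actual analysis the cut points would, of course, come from all 818 patients rather than a toy list; the grouping logic is the same.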
A total of 818 AMI patients were included. Mortality rates were 31% at 90 days and 40% at 365 days. Elevated HALP values were independently associated with a reduced risk of death: in fully adjusted models, patients in the top HALP quartile exhibited significantly lower all-cause mortality at 90 days (hazard ratio (HR) = 0.68; 95% confidence interval (CI): 0.47–0.99; p = 0.047) and 365 days (HR = 0.66; 95% CI: 0.47–0.90; p = 0.011). A nonlinear, inverse “L-shaped” association was observed, with an inflection point at a HALP score of 19.41; below this value, each unit increase in the HALP score reduced mortality risk by 2.4%–2.7%. Kaplan–Meier curves confirmed improved survival above the threshold. Subgroup analyses showed a generally consistent association between the HALP score and mortality, except for age, for which a significant interaction was observed (p = 0.003), indicating a stronger protective effect in older patients. Machine learning analyses supported the robustness and predictive value of the HALP score, with a maximum AUC of 0.7804.
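As a quick arithmetic aside (not from the paper's code), a per-unit risk reduction of 2.4%–2.7% below the inflection point corresponds to per-unit hazard ratios of roughly 0.976–0.973, since percent reduction = (1 − HR) × 100 and the Cox log-hazard coefficient is β = ln(HR):

```python
import math

# Convert a per-unit percent risk reduction into the implied hazard
# ratio and Cox coefficient. The percentages are the per-unit reductions
# reported for HALP scores below the inflection point (19.41).
def hr_from_reduction(pct_reduction):
    """Hazard ratio implied by a percent risk reduction per unit increase."""
    return 1.0 - pct_reduction / 100.0

for pct in (2.4, 2.7):
    hr = hr_from_reduction(pct)
    beta = math.log(hr)  # Cox coefficient: beta = ln(HR), negative = protective
    print(f"{pct}% per-unit reduction -> HR {hr:.3f}, beta {beta:+.4f}")
```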
The HALP score is significantly associated with all-cause mortality among critically ill individuals suffering from AMI.
Limited data are available regarding the prevalence of sleep-disordered breathing (SDB), particularly Cheyne–Stokes respiration (CSR), in patients with atrial fibrillation (AF) without left ventricular (LV) systolic dysfunction. This study therefore aimed to investigate the prevalence of SDB and CSR, as well as the factors associated with these conditions, in patients with AF without LV systolic dysfunction.
Patients with paroxysmal and non-paroxysmal AF underwent echocardiography and cardiorespiratory polygraphy. Multiple linear regression analyses were performed with the apnea–hypopnea index (AHI) and the percentage of CSR (%CSR) as the dependent variables.
A total of 462 patients were enrolled; 335 patients (72.5%) were diagnosed with SDB (AHI ≥5/h), with a median AHI of 10.3 events per hour (interquartile range, 4.7–20.8). CSR was observed in 107 patients (23.2%). Multiple linear regression analysis showed that age, sex, body mass index, and hypertension were independently correlated with AHI (p = 0.0188, 0.0002, <0.0001, and 0.0457, respectively). Conversely, age, diabetes mellitus (DM), and the plasma N-terminal prohormone of brain natriuretic peptide (NT-proBNP) level were independently correlated with %CSR (p < 0.0001, 0.0047, and 0.0095, respectively).
SDB and CSR were common in patients with AF. CSR was observed in older patients with DM and high NT-proBNP levels.