Ventricular septal defect (VSD) with atrial septal defect (ASD) is a common complex congenital heart disease. This study aimed to evaluate the clinical efficacy and safety of transesophageal echocardiography (TEE)-guided percardiac or combined percutaneous techniques for treating VSD with ASD in patients with varying anatomies.
This retrospective cohort study reviewed 40 cases of VSD with ASD treated in our center from June 2015 to July 2023. Under TEE guidance, peratrial, perventricular, or combined percardiac/percutaneous approaches were used based on the VSD type and secundum-type ASD. Follow-up examinations, including electrocardiography, transthoracic echocardiography, and X-ray, were performed at 24 hours and at 1, 3, 6, and 12 months after surgery, and yearly thereafter.
All patients underwent surgery successfully (100%), with 24, 5, and 11 patients undergoing simultaneous closure via the peratrial, perventricular, and combined percardiac/percutaneous approaches, respectively. Among them, there were six cases of mild residual shunt, three of mild tricuspid regurgitation, two of mild aortic valve regurgitation, one of mild mitral regurgitation, and three of incomplete right bundle branch block, all observed after VSD closure; all had resolved within 6 months of the operation. The chi-square test showed no significant differences in adverse event rates among the three surgical approaches (χ2 = 0.09, df = 2, p = 0.957). The Friedman test compared the preoperative and postoperative left ventricular end-diastolic diameters for the three approaches, yielding p < 0.001, p = 0.589, and p = 0.445, respectively. None of the patients required reoperation during the follow-up period.
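The chi-square comparison of adverse event rates across the three approaches can be sketched as follows. This is a minimal illustration of the Pearson statistic only; the contingency counts below are hypothetical and are not the study's data (the abstract reports χ2 = 0.09 with df = 2, but not the underlying table).

```python
# Pearson chi-square statistic for a contingency table of adverse-event
# counts by surgical approach. Counts are HYPOTHETICAL illustrations.

def chi_square(table):
    """Return (statistic, df) for a 2-D contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# rows: peratrial, perventricular, combined; columns: events, no events
table = [[9, 15], [2, 3], [4, 7]]
stat, df = chi_square(table)
```

With three approaches and a binary outcome, df = (3 − 1) × (2 − 1) = 2, matching the reported degrees of freedom.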
Under TEE guidance, using diverse percardiac or combined percutaneous device closure techniques for the one-stop treatment of different types of VSDs combined with ASD is safe, effective, and feasible. These approaches can be performed as a valuable alternative therapy for selected patients.
Our study evaluated the prognostic significance of the white blood cell (WBC) count and WBC subsets in relation to the risk of mortality in acute aortic dissection (AAD) patients during their hospital stay.
We included 833 patients with AAD in this retrospective study. The primary outcome was in-hospital mortality. Cox regression analysis was employed to determine the independent risk factors for mortality in patients with AAD. Kaplan–Meier survival analysis was used to compare the cumulative survival rates of patients with AAD in the low- and high-WBC groups.
Among the 342 patients with type A AAD, those in the high-WBC group exhibited a notably higher mortality rate than those in the low-WBC group. Kaplan–Meier analysis showed that patients in the high-WBC group had a significantly higher mortality rate. Multivariable Cox regression analysis demonstrated that an elevated WBC count was an independent predictor of in-hospital mortality in patients with type A AAD (hazard ratio, 2.01; 95% confidence interval (CI): 1.24 to 3.27; p = 0.005). Similar results were observed in the 491 patients with type B AAD.
An elevated WBC count was strongly associated with an increased risk of mortality in hospitalized patients with either type A or type B AAD.
Implantable cardioverter defibrillators (ICDs) have significantly reduced the incidence of sudden cardiac death in patients with heart failure, particularly those with ischemic heart disease. However, the impact on overall mortality remains controversial, especially in non-ischemic heart failure patients. The Danish Study to Assess the Efficacy of ICDs in Patients with Non-Ischemic Systolic Heart Failure (DANISH) trial and subsequent studies have questioned the efficacy of ICDs in this population, particularly among older patients. The present study aimed to evaluate survival outcomes and predictors in a Croatian cohort of patients with an ICD or cardiac resynchronization therapy defibrillator (CRT-D) device.
This retrospective cohort study analyzed data from 614 patients who received an ICD or CRT-D device at KBC Zagreb between 2009 and 2018. Patient data, including demographic information, device indication, and clinical parameters, were collected at the time of implantation. Follow-up data were systematically recorded to assess device activation and survival outcomes. Statistical analyses included a detailed descriptive analysis, Kaplan–Meier survival estimates, and Cox regression models.
The cohort consisted predominantly of males (83.4%), with a mean age of 58.7 years. Most had reduced left ventricular ejection fraction (mean 31.4%) and were classified as New York Heart Association (NYHA) class II or III. Over a median follow-up of 48.4 months, 36.6% of patients died. Device activation occurred in 30.3% of patients, with appropriate activation observed in 88.2% of these cases. Cox regression identified age, non-sustained ventricular tachycardia (NSVT), and decompensation history as significant survival predictors.
This study confirmed that appropriate device activation improved survival in patients with an ICD/CRT-D. Age, NSVT, and history of decompensation were key predictors of device activation and survival outcomes. These findings underscore the need for individualized patient assessment when considering ICD implantation, particularly in non-ischemic heart failure patients. Further research is needed to refine clinical guidelines and optimize patient selection for ICD therapy.
Coronary collateral circulation (CCC) is a crucial protective mechanism in acute myocardial infarction. This study aimed to identify early predictors of CCC in patients with acute ST-segment elevation myocardial infarction (STEMI) and develop a nomogram for predicting its presence.
We conducted a retrospective study of STEMI patients admitted to the Beijing Friendship Hospital from January 2015 to December 2023. Patients with CCC, as confirmed by coronary angiography, were matched 1:3 with those without CCC based on the date of admission. We compared baseline characteristics, laboratory parameters, coronary features, and in-hospital outcomes between the two groups. Variable selection was performed using least absolute shrinkage and selection operator (LASSO) regression analysis, followed by univariable and multivariable logistic regression analyses to identify independent predictors of CCC. A nomogram was constructed based on significant predictors and was validated through receiver operating characteristic (ROC) curve analysis, calibration curves, and decision curve analysis.
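The final step of the pipeline above, turning fitted logistic-regression coefficients into nomogram points, can be sketched as follows. This uses the usual scaling convention, in which the predictor with the largest absolute effect over its observed range spans 0–100 points; the coefficients and ranges below are hypothetical illustrations, not the study's fitted values.

```python
# Convert logistic-regression coefficients into nomogram point scales.
# Coefficients and predictor ranges are HYPOTHETICAL, for illustration only.

def nomogram_points(coefs, ranges):
    """coefs: {name: beta}; ranges: {name: (lo, hi)} observed ranges.
    Returns {name: total points across the range}, scaled so the predictor
    with the largest absolute effect spans 0-100 points."""
    effects = {k: abs(coefs[k]) * (ranges[k][1] - ranges[k][0]) for k in coefs}
    max_effect = max(effects.values())
    return {k: 100.0 * effects[k] / max_effect for k in coefs}

# hypothetical predictors echoing those named in the abstract
coefs = {"CHD_history": 0.9, "fibrinogen": 0.4, "Gensini": 0.02}
ranges = {"CHD_history": (0, 1), "fibrinogen": (1.0, 8.0), "Gensini": (0, 150)}
pts = nomogram_points(coefs, ranges)
```

A patient's total points are then summed across predictors and mapped back through the logistic function to a predicted probability of CCC.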
A total of 668 patients with STEMI were included in the study (501 without CCC and 167 with CCC). Patients with CCC had a higher prevalence of right coronary artery (RCA) closure and multi-vessel disease, as well as elevated inflammatory markers and altered coagulation parameters. Multivariable logistic regression analysis identified history of coronary heart disease (CHD), osmolality, fibrinogen level, left anterior descending (LAD) artery closure, left circumflex (LCX) artery closure, RCA closure, and the Gensini score as independent predictors of CCC. The nomogram incorporating these predictors demonstrated good discrimination and calibration, indicating accurate prediction of the presence of CCC.
History of CHD, osmolality, fibrinogen level, LAD, LCX, and RCA closures, and the Gensini score are independent predictors of CCC in patients with STEMI. The developed nomogram offers a clinically useful tool for identifying patients likely to have CCC, potentially aiding personalized treatment strategies.
Previous research has highlighted a connection between gut microbiota derivatives and atherosclerosis. This study assesses the association between gut microbiota derivatives and coronary artery disease (CAD) to enhance CAD prevention and treatment strategies.
Patients presenting with suspected CAD were categorized into CAD and non-CAD groups. A propensity score matching analysis was performed to exclude confounding factors. Key differences in general characteristics and gut microbiota derivatives between these groups were also assessed. Additionally, the correlations between the significantly different indicators and both the Gensini score and coronary flow reserve were explored. Moreover, the potential of these indicators to predict a diagnosis of coronary artery disease was analyzed.
After propensity score matching, the concentrations of interleukin-6 (IL-6) (47.23 ± 7.45 vs. 39.56 ± 7.37; p < 0.001), lipopolysaccharide (LPS) (12.79 ± 2.07 vs. 11.71 ± 1.88; p = 0.031), high-sensitivity C-reactive protein (hs-CRP) (13.58 ± 2.62 vs. 11.57 ± 2.49; p = 0.002), phenylacetyl glutamine (PAGIn) (619.20 ± 119.33 vs. 555.64 ± 109.29; p = 0.029), and trimethylamine-N-oxide (TMAO) (13.01 ± 2.19 vs. 11.70 ± 1.78; p = 0.011) in the CAD group were significantly elevated compared to those in the non-CAD group. Conversely, the serum levels of glucagon-like peptide-1 (GLP-1) (7.74 ± 2.07 vs. 9.06 ± 2.11; p = 0.012) were notably lower in the CAD group than in the non-CAD group. A positive association was observed between the serum concentrations of IL-6 (r = 0.410; p < 0.001), hs-CRP (r = 0.317; p < 0.007), TMAO (r = 0.311; p < 0.008), and coronary Gensini score. Moreover, IL-6 (b = 1.769, 95% confidence interval (CI): 0.256–3.282; p = 0.023) and TMAO (b = 10.735, 95% CI: 4.883–16.588; p < 0.001) had a direct positive impact on the coronary Gensini score. The highest diagnostic value for CAD was observed when the IL-6 cut-off value was 45.17 (sensitivity 69.6%, specificity 73.1%, area under curve 0.770; 95% CI: 0.662–0.879; p < 0.001). Meanwhile, the highest diagnostic value for CAD was noted when the TMAO cut-off value was 12.44 (sensitivity 65.2%, specificity 76.9%, the area under the curve 0.689; 95% CI: 0.564–0.814; p = 0.008). Serum TMAO was negatively correlated with coronary flow reserve (CFR) in CAD patients (r = –0.593; p = 0.009).
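The diagnostic cut-off values reported above are typically chosen by maximizing the Youden index on the ROC curve. A minimal sketch follows; the IL-6 values are hypothetical, chosen only to illustrate how a cutoff such as the reported 45.17 would be derived, and are not the study's measurements.

```python
# Optimal ROC cutoff via the Youden index (sensitivity + specificity - 1).
# The IL-6 values below are HYPOTHETICAL illustrations.

def best_cutoff(cases, controls):
    """Return (cutoff, sensitivity, specificity) maximizing Youden's J,
    classifying value >= cutoff as positive."""
    best = (None, 0.0, 0.0, -1.0)
    for c in sorted(set(cases) | set(controls)):
        sens = sum(v >= c for v in cases) / len(cases)
        spec = sum(v < c for v in controls) / len(controls)
        j = sens + spec - 1.0
        if j > best[3]:
            best = (c, sens, spec, j)
    return best[:3]

cases = [52.1, 47.3, 44.9, 50.6, 39.8, 48.4]     # hypothetical CAD IL-6
controls = [38.2, 41.5, 35.9, 44.1, 37.4, 40.3]  # hypothetical non-CAD IL-6
cutoff, sens, spec = best_cutoff(cases, controls)
```

Each candidate cutoff is an observed value; the one that best separates the groups is reported together with its sensitivity and specificity, as in the abstract.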
These findings suggest that serum IL-6, LPS, hs-CRP, PAGIn, TMAO, and GLP-1 levels can be used as clinical markers for predicting CAD severity. Among these, IL-6, hs-CRP, and TMAO were identified as independent risk factors influencing CAD severity, and elevated levels of IL-6 and TMAO exhibited predictive utility for CAD diagnosis. Furthermore, serum TMAO is a potential clinical marker for forecasting CAD prognosis.
Blood glucose and serum albumin can be biomarkers at admission since they are easily accessible and demonstrate correlations with cardiovascular diseases. The predictive ability of the admission blood glucose to albumin ratio (AAR) for long-term prognosis in patients with acute coronary syndrome (ACS) and its potential to elevate the predictive value of the Global Registry of Acute Coronary Events (GRACE) risk score in ACS patients post-percutaneous coronary intervention (PCI) remains unknown. Hence, this study aimed to investigate the incremental prognostic value of the AAR in patients with ACS undergoing PCI.
A rigorous development-validation approach was implemented to optimize the GRACE risk score, utilizing the AAR parameter in 1498 patients suffering from ACS after PCI at the Third People’s Hospital of Chengdu, Sichuan, China.
Over a median follow-up of 31.25 (27.53, 35.10) months, the incidence of major adverse cardiac events (MACEs), defined as a composite outcome encompassing all-cause death, cardiac death, nonfatal myocardial infarction, nonfatal stroke, and unplanned repeat revascularization, was higher in individuals with higher AARs. Moreover, the AAR was an independent predictor of long-term prognosis in ACS patients undergoing PCI (HR, 1.145; 95% CI: 1.045–1.255; p = 0.004). The integration of the AAR with the GRACE risk score increased the C statistic from 0.717 (95% CI: 0.694–0.740) to 0.733 (95% CI: 0.690–0.776) (p < 0.01).
The AAR is an independent predictor of prognosis in ACS patients and significantly increased the predictive value of the GRACE risk score.
This study aimed to investigate the conceivable utility of the aspartate aminotransferase to platelet ratio index (APRI) in prognostic prediction for patients with cardiogenic shock (CS) hospitalized in the intensive care unit (ICU).
Data for patients diagnosed with CS were obtained from the Medical Information Mart for Intensive Care-IV (MIMIC-IV) database and categorized into groups based on the APRI quartiles. The primary endpoint encompassed in-hospital and ICU mortality rates. The secondary outcomes included sepsis and acute kidney injury (AKI). Kaplan–Meier survival analysis was utilized to assess differences in main endpoints among groups categorized by their APRI.
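The group-wise survival comparison described above rests on the Kaplan–Meier product-limit estimator, which can be sketched in a few lines. The (time, event) pairs below are hypothetical observations for one APRI quartile group, not MIMIC-IV data.

```python
# Product-limit (Kaplan-Meier) survival estimate for one group.
# data: (time, event) pairs; event=1 is a death, event=0 a censored case.
# The observations below are HYPOTHETICAL illustrations.

def kaplan_meier(data):
    """data: list of (time, event). Returns [(time, S(t))] at event times."""
    s = 1.0
    curve = []
    times = sorted({t for t, e in data if e == 1})
    for t in times:
        at_risk = sum(ti >= t for ti, _ in data)
        deaths = sum(ti == t and e == 1 for ti, e in data)
        s *= 1.0 - deaths / at_risk  # survival drops at each event time
        curve.append((t, s))
    return curve

group = [(3, 1), (5, 0), (7, 1), (8, 1), (10, 0), (12, 1)]
curve = kaplan_meier(group)
```

Censored observations (event = 0) leave the curve unchanged but still count toward the at-risk set until their censoring time, which is what distinguishes this estimator from a simple survival fraction.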
This study collected data from 1808 patients diagnosed with CS. Multivariate Cox regression analysis indicated that an elevated APRI was independently correlated with a heightened risk of in-hospital mortality (hazard ratio (HR) 1.005 [95% confidence interval (CI) 1.003–1.007]; p < 0.001) and ICU mortality (HR 1.005 [95% CI 1.003–1.007]; p < 0.001). Multivariate logistic regression analysis demonstrated that APRI was independently correlated with a heightened risk of sepsis (odds ratio (OR) 1.106 [95% CI 1.070–1.144]; p < 0.001) and AKI (OR 1.054 [95% CI 1.035–1.073]; p < 0.001).
An increased APRI was linked to worse clinical outcomes in critically ill patients with CS. Nevertheless, further extensive prospective investigations are needed to validate these findings.
Left atrial appendage closure (LAAC) has been reported to be a viable alternative to prevent thromboembolic events in atrial fibrillation (AF) patients. Interatrial communication closure, such as atrial septal defect (ASD) and patent foramen ovale (PFO) closure, could significantly decrease the occurrence of stroke. For AF patients with interatrial communication, the success rate as well as the long-term outcomes of ‘One stop’ closure remain elusive.
Studies were systematically screened using online databases (including PubMed, Cochrane Library, Web of Science, Embase, China National Knowledge Infrastructure (CNKI) database, and WanFang database) from their establishment to 1st August 2024. We utilized a fixed-effect model to synthesize the success rate and the long-term outcomes. Subgroup analysis was performed to identify the potential confounders.
A total of 7 studies comprising 156 patients were included. ASD/PFO closure combined with LAAC showed a high degree of feasibility, with a success rate of 1.00 (95% CI: 0.99, 1.00; p < 0.001). Meanwhile, ‘One stop’ ASD/PFO closure combined with LAAC exhibited high long-term safety and a low occurrence of complications. Moreover, subgroup analysis revealed that bleeding event occurrence was relatively higher in the subgroups with a male proportion ≥50% and with a HAS-BLED score ≥3.
‘One stop’ ASD/PFO closure combined with LAAC performs satisfactorily in AF patients with interatrial communication.
CRD42023462221, https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42023462221.
Vulnerable or high-risk coronary plaques are usually referred to as angiographically mild to moderate lesions characterized by a large plaque burden, positive vessel remodeling, thin fibrous cap, and large necrotic/lipid core. According to several pathology studies, these plaques represent the substrate of coronary thrombosis in about two-thirds of cases; therefore, there has been increasing interest in detecting and treating vulnerable plaques (VPs). Nowadays, VP detection is possible through noninvasive and invasive imaging techniques, such as coronary computed tomography, magnetic resonance imaging, intravascular ultrasound, optical coherence tomography, and near-infrared spectroscopy. Since VPs were shown to be associated with cardiovascular events in observational studies, pharmacological and non-pharmacological strategies have been investigated to achieve a regression and/or a passivation of these plaques. In addition to pharmacological therapies, mainly focused on lipid-lowering agents, there has been a recent growing interest in interventional therapies, including coronary scaffolds, stents, and drug-coated balloons. This led to the concept of preventive percutaneous coronary intervention, which, unlike the treatment of culprit lesions in acute coronary syndromes or of ischemia-inducing stenoses, as recommended by guidelines, implies the treatment of angiographically and functionally non-significant lesions based on one or more high-risk plaque characteristics as identified by noninvasive or intracoronary imaging. This article provides an updated review of key concepts in defining and detecting VPs; their prognostic value and available pharmacological and interventional management evidence will also be discussed.
As technology advances, surgical approaches for atrial fibrillation have diversified. Surgical treatments include Cox-Maze surgery and left atrial appendage occlusion or closure using a clip. Cox-Maze surgery removes excessive cardiac electrical conduction pathways, ensures that electrical signals propagate exclusively through the predetermined maze channel, and restores normal heart rhythm. Left atrial appendage closure reduces the risk of long-term disability or death caused by left atrial appendage thromboembolism in patients with atrial fibrillation. The associated devices are constantly being refined, including radiofrequency clamps (monopolar or bipolar) and left atrial appendage closure devices (external excision using staplers, internal ligation with biomatrix patch occlusion, and external device placement with the AtriClip and Endoloop ligature). In addition to surgical interventions, surgical biomaterials with biocompatibility and electrical conductivity have emerged in the basic research phase of atrial fibrillation treatment. This review delineates the primary surgical techniques, emphasizing their safety and efficacy in treating atrial fibrillation. An introduction to commonly used surgical equipment is provided as a reference for the clinical management of atrial fibrillation.
Stroke remains a significant, potentially life-threatening complication following transcatheter aortic valve implantation (TAVI). Moreover, the rate of strokes, particularly disabling strokes, has not diminished over time despite improvements in pre-procedural planning and implantation techniques. The mechanisms of stroke in TAVI patients are complex, and identifying consistent risk factors is challenging due to evolving patient profiles, varied study cohorts, and continuous device modifications. Multiple pharmacological and mechanical treatment strategies have been developed to mitigate the risk of stroke, particularly as TAVI expands toward younger populations. This review article discusses the pertinent factors in the evolution of stroke post-TAVI, appraises the latest evidence and techniques designed to reduce the risk of stroke, and highlights future strategies and technologies to address this unmet need.
Helicobacter pylori (H. pylori) infection and atrial fibrillation (AF) are prevalent global health concerns that significantly impact societal and economic well-being. This study explored the potential associations between H. pylori infection and the incidence and progression of AF. Emerging research suggests that H. pylori may influence AF through various pathways, including systemic inflammation, metabolic disturbances, immune responses, and changes in the gut microbiota. These pathways provide a novel perspective on the etiology of AF, suggesting that chronic H. pylori infection could exacerbate or even initiate the arrhythmic events typical of AF. Current evidence, while preliminary, points to significant correlations, particularly through changes in markers such as C-reactive protein (CRP) and lipid metabolism, which are heightened in individuals with active H. pylori infection. However, the exact mechanisms and causal nature of this relationship remain elusive, with studies showing conflicting results. This inconsistency underscores the need for more comprehensive and rigorously designed clinical and experimental research to elucidate fully the interactions between H. pylori infection and AF. Understanding these connections is crucial for developing innovative treatments and management strategies targeting microbial influences in AF patients. Future research should focus on defining the role of H. pylori eradication in the clinical management of AF, assessing its impact on disease progression and patient outcomes.
This research focuses on the unresolved question of how low muscle mass influences the likelihood of atrial fibrillation (AF) recurrence after ablation treatment. Despite the growing body of evidence highlighting the importance of muscle mass in cardiovascular health, the specific impact of low muscle mass on the recurrence of AF following ablation has yet to be well-established. Thus, this study evaluated the relationship between a low computed tomography (CT)-based skeletal muscle index (SMI) of muscle sites at the fourth thoracic level (T4-SMI) and AF recurrence post-radiofrequency ablation. Furthermore, this study aimed to determine whether the T4-SMI is a predictive marker for AF recurrence.
This study included 641 patients with AF who underwent radiofrequency ablation. T4 muscle sites were determined using SliceOmatic software. Height- and body mass index (BMI)-corrected SMIs were calculated.
The lowest quartile of the T4-SMI was defined for each sex as the “low SMI” group. The height-adjusted T4-SMI thresholds were 69.7 cm2/m2 for males and 55.91 cm2/m2 for females. The BMI-adjusted thresholds were 8.10 cm2/kg/m2 for males and 5.78 cm2/kg/m2 for females. After adjustment for potential confounders, a low T4-SMI was associated with a higher risk of AF recurrence. The correlation between T4-SMI (height) and AF recurrence was fully validated by constructing multiple models, and adjusting for different covariates barely altered the results. Fully adjusted models suggested that, compared with the fourth T4-SMI (height) quartile, the odds ratio (OR) with 95% confidence interval (CI) for the “low SMI” group was 1.57 (0.76–3.22). Finally, subgroup and interaction analyses according to gender, age, overweight/obesity, hypertension, and diabetes indicated that the differences between strata were not significant.
Low CT-based BMI- or height-adjusted T4-SMIs were risk factors for AF recurrence post-radiofrequency ablation. A lower T4-SMI (height) significantly correlated with AF recurrence post-ablation, regardless of gender, age, or overweight/obesity. The height adjustment performed better than the BMI adjustment in that regard.
Extended aortic arch repair (EAR) is increasingly adopted for treating acute type A aortic dissection (ATAAD). However, existing prediction models may not be suitable for assessing the in-hospital death risk in ATAAD patients undergoing EAR. This study aimed to develop a comprehensive risk prediction model for in-hospital death following EAR based on patients’ preoperative status and surgical data, which may help identify high-risk individuals and improve outcomes following EAR.
We reviewed the clinical records of consecutive adult ATAAD patients undergoing EAR at our institute between January 2015 and December 2022. Utilizing data from 925 ATAAD patients undergoing EAR, we employed multivariable logistic regression and machine learning techniques to develop nomograms for in-hospital mortality. The machine learning techniques employed included a simple decision tree, random forest (RF), eXtreme Gradient Boosting (XGBoost), and support vector machine (SVM).
The nomogram based on SVM outperformed the others, achieving a mean area under the receiver operating characteristic (ROC) curve (AUC) of 0.842 on the training dataset and a mean AUC of 0.782 on the testing dataset, accompanied by a Brier score of 0.058. Key risk factors included cerebral malperfusion, mesenteric malperfusion, preoperative critical state, Marfan syndrome, platelet count, D-dimer, coronary artery bypass grafting, and cardiopulmonary bypass time. A web-based application was developed for clinical use.
We developed a novel nomogram risk prediction model based on the SVM algorithm for in-hospital death following extended aortic arch repair for ATAAD, with good discrimination and accuracy.
Registration number ChiCTR2200066414, https://www.chictr.org.cn/showproj.html?proj=187074.
A meta-analysis was conducted to determine whether the cardiovascular mortality and lipid-lowering effects of alirocumab and evolocumab are influenced by various baseline low-density lipoprotein cholesterol (LDL-C) levels.
We searched for literature published before June 2023. Eligible randomized controlled trials (RCTs) included adults treated with alirocumab or evolocumab and reported LDL-C changes and cardiovascular deaths. The primary endpoints were cardiovascular mortality and percent changes in LDL-C from baseline.
Forty-one RCTs were included in the meta-analysis. Evolocumab did not significantly affect cardiovascular mortality whether baseline LDL-C levels were ≥100 mg/dL or <100 mg/dL. However, the stratified result showed that alirocumab decreased the risk of cardiovascular mortality in patients with a baseline LDL-C level of ≥100 mg/dL (relative risk (RR) 0.45; 95% CI: 0.22 to 0.92; p = 0.03). In terms of lipid-lowering efficacy, alirocumab (mean difference (MD) –56.62%; 95% CI: –60.70% to –52.54%; p < 0.001) and evolocumab (MD –68.10%; 95% CI: –74.85% to –61.36%; p < 0.001) yielded the greatest percentage reductions in LDL-C when baseline levels were 70–100 mg/dL, while the smallest reductions for alirocumab (MD –37.26%; 95% CI: –44.06% to –30.46%; p < 0.001) and evolocumab (MD –37.55%; 95% CI: –40.47% to –34.63%; p < 0.001) occurred with baseline LDL-C levels of ≥160 mg/dL.
Alirocumab and evolocumab presented a better lipid-lowering effect when the baseline LDL-C levels were <100 mg/dL. Alirocumab was associated with a significant reduction in cardiovascular mortality at baseline LDL-C levels of ≥100 mg/dL. This finding can have significant implications for the development of personalized drug therapy.
CRD42023446723, https://www.crd.york.ac.uk/PROSPERO/view/CRD42023446723.
Chronic total occlusion (CTO) is a complex and difficult type of coronary lesion for which elective secondary intervention after subintimal plaque modification (SPM) can improve the success rate. This study sought to determine the most appropriate timing for secondary interval interventions to maximize the benefit to the patient.
This study retrospectively included patients who failed their first CTO percutaneous coronary intervention (PCI) at Beijing Anzhen Hospital Department of Cardiology from January 2019 to December 2022. We reviewed the clinical characteristics, procedural features, and outcomes of patients who underwent SPM and returned to our institution for a second CTO-PCI.
Of the 2847 patients who visited our institution between January 2019 and December 2022, 528 underwent SPM and returned to our institution on an elective basis for a secondary procedure. Of these, 236 procedures were performed within 30 days (Group I), and 292 were performed between 30 and 90 days (Group II). After the intervention, the occluded segment was successfully opened in 170 (72.0%) Group I and 248 (84.9%) Group II participants. When analyzing the factors for procedural failure, we found that the interval between procedures, diabetes mellitus, hyperlipidemia, and a history of previous PCI or percutaneous transluminal coronary angioplasty (PTCA) were associated with secondary intervention failure. When analyzing the safety of the procedure, we found that pericardial effusion was the most common complication after the procedure, with an incidence of 7.4%. There was no notable variation in the incidence of pericardial effusion between the two groups (8.9% vs. 6.2%; p = 0.232).
Higher success rates were observed when secondary procedures were performed between 30 and 90 days instead of within 30 days after the initial CTO-PCI SPM, with no significant difference in safety noted between the two groups.
The association between stroke history and clinical events after valve replacement in patients with atrial fibrillation (AF) combined with valvular heart disease (VHD) is unclear. Thus, we sought to investigate the relationship between stroke history and clinical events in patients with AF after valve replacement.
This retrospective cohort study enrolled 746 patients with AF who underwent valve replacement between January 2018 and December 2019 at the Wuhan Asia Heart Hospital. Patient information was collected from the hospital’s electronic medical record system. Patients were categorized based on their stroke history and followed through outpatient visits or by telephone until the occurrence of an endpoint event; the maximum follow-up period was 24 months. Endpoint events included thrombotic events, bleeding, and all-cause mortality. The frequency of thrombotic, hemorrhagic, and fatal events during the follow-up period was compared between the two groups. Independent risk factors for endpoint events were analyzed using multivariable Cox regression.
The analysis included 746 patients. Over a 24-month follow-up period, there were more total adverse events (hazard ratio (HR) = 2.08, 95% confidence interval (CI) 1.06–4.08, p = 0.018), thrombotic events (HR = 10.28, 95% CI 2.85–37.11, p < 0.001), and increased all-cause mortality (HR = 5.74, 95% CI 1.84–17.93, p < 0.001) in the stroke history group than in the non-stroke history group. Fewer bleeding events were observed in the group with a history of stroke (HR = 0.87, 95% CI 0.37–2.04, p = 0.757). A multivariable Cox regression analysis revealed that a personal history of stroke was an independent risk factor for total adverse events, thrombotic events, and all-cause mortality.
Previous stroke history is significantly associated with adverse events in AF patients following valve replacement.
Diuretic resistance (DR) is characterized by insufficient enhancement of fluid and sodium excretion despite maximum loop diuretic doses, indicating a phenotype of refractory heart failure (HF). Recently, metabolomics has emerged as a crucial tool for diagnosing and understanding the pathogenesis of various diseases. This study aimed to differentiate diuretic-resistant from non-resistant HF patients to identify biomarkers linked to the emergence of DR.
Serum samples from HF patients, both with and without DR, were subjected to non-targeted metabolomic analysis using liquid chromatography-tandem mass spectrometry. Metabolite variations between groups were identified using principal component analysis and orthogonal partial least-square discriminant analysis. Metabolic pathways were assessed through the Kyoto Encyclopedia of Genes and Genomes database enrichment analysis, and potential biomarkers were determined using receiver operating characteristic curves (ROCs).
In total, 192 metabolites exhibited significant differences across the two sample groups. Among these, up-regulation was observed in 164 metabolites, while 28 metabolites were down-regulated. A total of 28 pathways involving neuroactive ligand-receptor interaction and amino acid biosynthesis were affected. The top five metabolites identified by ROC analysis as potential DR biomarkers were hydroxykynurenine, perillic acid, adrenic acid, 5-acetamidovalerate, and adipic acid.
Significant differences in metabolite profiles were observed between the diuretic-resistant and non-diuretic-resistant groups among patients with HF. The top five differentially expressed endogenous metabolites were hydroxykynurenine, perillic acid, adrenic acid, 5-acetamidovalerate, and adipic acid. The primary metabolic pathways implicated in DR were amino acid, energy, and nucleotide metabolism.
This study was registered with the China Clinical Trials Registry (https://www.chictr.org.cn/hvshowproject.html?id=197183&v=1.7, ChiCTR2100053587).
This research assesses how fine particulate matter (PM2.5) pollution influences cardiovascular diseases (CVDs) globally.
Utilizing data from the 2021 Global Burden of Disease (GBD) study, we assessed the impact of PM2.5 pollution on CVDs in individuals aged 25 and older. The health burden was quantified using measures such as disability-adjusted life years (DALYs), age-standardized rates (ASRs), and the estimated annual percentage change (EAPC). Joinpoint regression models were used to describe the temporal trends of CVD burdens, while Bayesian age–period–cohort (BAPC) models were employed to project the CVD burdens through 2030. Frontier analysis was conducted to identify potential areas for improvement and gaps between the development statuses of different countries. Decomposition analysis was applied to assess the impact of population growth, aging, and epidemiological changes on the burden of CVDs.
Despite a decline in ASRs for both sexes, males continued to bear a disproportionate burden of CVDs. While substantial reductions in ASRs have been noted in Western Europe and High-income North America, smaller decreases in the EAPC have been seen in South Asia, Oceania, and Western Sub-Saharan Africa; however, Oceania faces the highest mortality burden. An inverse relationship between the sociodemographic index (SDI) and ASRs is evident across countries. Meanwhile, Afghanistan and Egypt reported elevated ASRs, and Iceland recorded the lowest rate. Projections suggest a potential reversal in ASR trends by 2030. A decomposition analysis revealed that intracerebral hemorrhage poses the greatest burden in middle SDI regions, while ischemic heart disease is notably burdensome in high SDI and high-middle SDI regions.
This study highlights the disproportionate burden of CVDs associated with PM2.5 pollution, particularly in males and lower SDI regions, with significant regional disparities and projections indicating potential reversals in trends.
Rheumatoid arthritis (RA) is a chronic, systemic autoimmune disease characterized by progressive joint deformity and increased mortality. RA patients typically exhibit elevated plasma levels of inflammatory markers, contributing to endothelial dysfunction and increased arterial wall stiffness—a recognized marker of subclinical atherosclerosis and heightened cardiovascular risk. This study aimed to evaluate carotid arterial wall stiffness in RA patients using ultrasound (US) imaging with speckle tracking carotid strain (STCS) software, a non-invasive method for assessing subclinical cardiovascular disease indicators.
This analytical case–control study was conducted at the Aydin Adnan Menderes University Hospital Departments of Radiology and Rheumatology. Patients who met the inclusion criteria were enrolled in the study. Data collection tools included an 11-item case report form, developed by the researchers based on the relevant literature, and carotid US examinations.
The study included 143 participants: 75 RA patients (60 female and 15 male) and 68 control subjects (54 female and 14 male). The mean age was 50.9 ± 11.4 years (range: 25.0–74.0) for the RA group and 53.1 ± 12.6 years (range: 20.0–77.0) for the control group. Systolic blood pressure (SBP) and C-reactive protein (CRP) levels were assessed; mean ± SD CRP was 7.4 ± 11.5 mg/L in the RA group and 8.6 ± 22.2 mg/L in the control group. However, because a few outliers skewed the control-group mean, medians were compared: 3.3 mg/L (range: 2.0–71.9) in the RA group versus 2.0 mg/L (range: 0.8–145.0) in the controls; this nonparametric comparison showed significantly higher typical CRP levels in the RA group (p < 0.05). All stiffness and strain parameters in the axial and longitudinal planes showed statistically significant differences between the two groups (p < 0.05), except the circumferential strain parameter “displacement (DP)” (p = 0.074). Although no significant correlation was found between the disease activity score (DAS) and any strain or stiffness parameter, the carotid intima-media thickness (CIMT) exhibited a significant positive correlation with disease duration (p = 0.001). After adjusting for confounding factors (age, gender, body mass index (BMI), and smoking status) using multivariate linear regression analysis, RA remained a significant predictor for all stiffness and strain parameters, except for the circumferential strain parameter DP.
Applying functional parameters to assess arterial wall stiffness and tension levels provides valuable insights for early detection of cardiovascular disease risk, preceding classical US findings such as increased intima-media thickness (IMT) and plaque formation. While preliminary, our findings from STCS measurements in RA patients show promise in evaluating cardiovascular disease risk in this population and potentially improving long-term outcomes through timely interventions.
The high prevalence of sarcopenia among hypertensive adults is a global health issue. A growing literature demonstrates that a high-antioxidant diet can protect against sarcopenia. However, little attention has been paid to the association between composite dietary antioxidant intake and sarcopenia in hypertension. This study therefore investigated the association between the composite dietary antioxidant index (CDAI) and sarcopenia among hypertensive adults.
This study included 6995 hypertensive adults from the National Health and Nutrition Examination Survey (NHANES) 2001–2006 and 2013–2018, with 3212 (45.92%) females and 3783 (54.08%) males. Appendicular lean mass (ALM) and sarcopenia were assessed by dual-energy X-ray absorptiometry (DEXA). All hypertensive adults participating in NHANES were eligible to participate in dietary interviews, and the average intake of six antioxidants over two days was used to calculate the CDAI. Logistic regression was conducted to determine odds ratios (ORs) and 95% confidence intervals (CIs). Subgroup analyses and restricted cubic spline (RCS) regressions were additionally utilized.
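For context, the CDAI is conventionally calculated by standardizing each participant's intake of the six antioxidants against the cohort mean and standard deviation and summing the resulting z-scores. A minimal sketch with hypothetical intakes (not NHANES data):

```python
from statistics import mean, stdev

def cdai(intakes):
    """Composite dietary antioxidant index: per-nutrient z-scores, summed.

    `intakes` maps a nutrient name to a list of intakes across participants;
    returns one CDAI value per participant. All values here are hypothetical.
    """
    n = len(next(iter(intakes.values())))
    scores = [0.0] * n
    for nutrient, values in intakes.items():
        m, s = mean(values), stdev(values)
        for i, v in enumerate(values):
            scores[i] += (v - m) / s  # z-score contribution of this nutrient
    return scores

# Three hypothetical participants (middle, high, and low antioxidant intake)
intakes = {
    "vitamin_a_mcg": [600, 900, 450],
    "vitamin_c_mg": [70, 120, 40],
    "vitamin_e_mg": [8, 15, 5],
    "zinc_mg": [9, 14, 6],
    "selenium_mcg": [90, 140, 60],
    "carotenoids_mcg": [4000, 9000, 2500],
}
print([round(c, 2) for c in cdai(intakes)])
```

By construction the z-scores for each nutrient sum to zero across the cohort, so a higher CDAI marks above-average combined antioxidant intake, which is then analyzed in quartiles as in the abstract.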
The mean age was 48.47 ± 0.27 years, and 1059 participants (15.14%) were considered to have sarcopenia. The highest quartile of CDAI had a 61% lower risk of sarcopenia (OR = 0.39, 95% CI: 0.25, 0.60) compared with the lowest quartile. RCS revealed linear associations of CDAI with both sarcopenia and ALM. Subgroup analyses demonstrated a more pronounced inverse correlation between CDAI and sarcopenia in females.
In summary, our results indicated an inverse association between CDAI and sarcopenia in hypertension. These findings highlight the beneficial role of an antioxidant-rich diet in prevention and provide a valid approach for managing sarcopenia in hypertensive adults.
Hypertensive disorders in pregnancy (HDP) are associated with adverse pregnancy outcomes. Three-dimensional (3D) echocardiography provides greater accuracy for assessing cardiac geometry and function during pregnancy. The aim was to assess the impact of 3D left ventricular (LV) systolic function on pregnancy outcomes in HDP.
This prospective cohort study included primiparous women with singleton pregnancies and no previous comorbidities who underwent medical history assessment, laboratory tests, ambulatory blood pressure monitoring (ABPM), and transthoracic echocardiography at baseline and six weeks after delivery. Participants were divided into an HDP group and a control group. Pregnancy outcomes (intrauterine growth restriction (IUGR), preterm delivery, and birth weight) were recorded and analyzed.
The study involved 174 HDPs and 64 controls, with a median gestational age of 34 weeks (31; 36). Compared with controls, the HDP group exhibited significantly impaired values in both two-dimensional (2D) and 3D parameters of LV systolic and diastolic function. They had higher LV mass index values and lower absolute values for 2D global longitudinal strain and 3D LV strain in all directions (p < 0.001). Multivariable regression analysis revealed that body mass index (BMI), with an odds ratio (OR) of 0.751 (95% confidence interval (CI): 0.666–0.847, p < 0.001), and 3D LV global area strain (GAS), with an OR of 0.234 (95% CI: 0.155–0.352, p < 0.001), were the strongest predictors of IUGR, while BMI with an OR of 0.832 (95% CI: 0.758–0.914), nighttime systolic blood pressure (SBP) with an OR of 1.055 (95% CI: 1.032–1.079, p < 0.01), and 3D LV ejection fraction (EF) with an OR of 0.780 (95% CI: 0.687–0.885) were the strongest predictors of preterm delivery. The receiver operating characteristic (ROC) curve showed that the model with BMI and 3D LV GAS was a good predictor of IUGR, with an area under the curve (AUC) of 0.951 (0.925–0.976), 89.5% sensitivity, and 86.4% specificity (p < 0.001), while the model with BMI, nighttime SBP, and 3D LV EF predicted preterm delivery with an AUC of 0.835 (0.776–0.893), 79.1% sensitivity, and 73.7% specificity (p < 0.001). Pearson correlation showed a significant positive correlation between birth weight and 3D GAS (r = 0.485; p < 0.001).
LV GAS is significantly associated with IUGR and birth weight, while 3D LV EF strongly predicts preterm delivery.
Previous research has suggested that metformin may inhibit the dilation of an abdominal aortic aneurysm (AAA); however, these findings are controversial. Additionally, limited reporting exists on the relationships between metformin and thoracic aortic aneurysm (TAA) and aortic dissection (AD). Therefore, this study aimed to assess the potential relationship between metformin and the risk of aortic aneurysm (AA)/AD using the Mendelian randomization (MR) analysis.
Genome-wide association studies and FinnGen summary data were utilized for the MR analysis. The causal relationship between metformin and AA/AD was primarily assessed using the inverse-variance weighted (IVW) method. Sensitivity analyses were conducted to detect heterogeneity and pleiotropy.
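For readers unfamiliar with the IVW method, it averages the per-variant Wald ratios (SNP-outcome effect divided by SNP-exposure effect), weighting each by its inverse variance. The sketch below uses invented SNP summary statistics, not the GWAS/FinnGen data:

```python
# Toy sketch of the inverse-variance weighted (IVW) MR estimator; the SNP
# effect sizes below are invented for illustration only.

def ivw_estimate(beta_exp, beta_out, se_out):
    """IVW causal estimate: weighted average of per-SNP Wald ratios
    (beta_out / beta_exp), with weights beta_exp**2 / se_out**2."""
    num = sum(bx * by / se**2 for bx, by, se in zip(beta_exp, beta_out, se_out))
    den = sum(bx**2 / se**2 for bx, se in zip(beta_exp, se_out))
    return num / den

# Hypothetical instruments: SNP effects on the exposure (metformin use)
# and on the outcome (aneurysm risk, log-odds scale)
beta_exp = [0.10, 0.08, 0.12, 0.09]
beta_out = [-0.050, -0.042, -0.058, -0.047]
se_out = [0.010, 0.012, 0.009, 0.011]
print(round(ivw_estimate(beta_exp, beta_out, se_out), 3))  # prints -0.499
```

A negative pooled estimate on the log-odds scale corresponds to an OR below 1, i.e., a protective direction of effect, as reported in the results.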
The results indicated a negative correlation between metformin treatment and the risk of both AA and AD, with odds ratios (ORs) from the IVW method as follows: OR = 0.010, 95% confidence interval (CI): 0.000–0.212, p = 0.003 for AA; OR = 0.004, 95% CI: 0.000–0.220, p = 0.007 for AAA; OR = 0.017, 95% CI: 0.000–0.815, p = 0.039 for TAA; and OR = 0.001, 95% CI: 0.000–0.531, p = 0.032 for AD. These findings suggest that metformin might act as a protective factor against the occurrence of AA/AD. Furthermore, sensitivity analyses validated the robustness of these findings.
This MR analysis identified a potential genetic causal relationship between metformin use and the risks of AA/AD, suggesting that metformin could serve as a protective agent in decreasing the incidences of these conditions.
The existence of internodal tracts (ITs) is controversial. Indeed, ITs in the cardiac conduction system (CCS), connected to the sinoatrial node (SAN), transmit electrical signals quickly to the left atrium and the atrioventricular node (AVN). Interestingly, research has suggested that the ITs and the tail of the SAN may share developmental homology. Additionally, many studies indicate that IT blockage can lead to atrial conduction block and is associated with atrial fibrillation (AF). However, few studies have addressed the morphogenesis, development, and function of ITs. Therefore, this paper aims to review the morphogenesis, development, and function of ITs, focusing on the regulatory mechanisms of transcription factors (TFs), such as NK2 homeobox 5 (NKX2.5), SHOX homeobox 2 (SHOX2), hyperpolarization activated cyclic nucleotide gated potassium channel 4 (HCN4), and T-box transcription factor 3 (TBX3), in the development and morphogenesis of ITs. This review also explores the causes of arrhythmias, especially atrial block, in order to provide new insights into the pathogenesis of CCS disorders.
Prior studies have established the safety and efficacy of conduction system pacing (CSP) in improving echocardiographic parameters and clinical outcomes. This meta-analysis aimed to investigate whether CSP could reduce the occurrence of new-onset atrial fibrillation (AF) in comparison to traditional right ventricular pacing (RVP) therapy.
A literature search was performed in PubMed, Embase, and the Cochrane Library to identify relevant clinical studies comparing CSP with RVP from January 2000 to June 2024. The study outcome was new-onset AF after pacemaker implantation. Risk ratios (RRs) and odds ratios (ORs) with 95% confidence intervals (CIs) were estimated.
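As background, study-level risk ratios are typically pooled by inverse-variance weighting on the log scale, with Cochran's Q and the I2 statistic quantifying heterogeneity. A minimal sketch with invented study estimates, not the reviewed studies' data:

```python
from math import log, exp

def pool_rr(rrs, ses):
    """Fixed-effect pooled RR from per-study RRs and SEs of log(RR);
    also returns the I2 heterogeneity statistic (percent)."""
    w = [1 / se**2 for se in ses]            # inverse-variance weights
    logs = [log(r) for r in rrs]
    pooled = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - pooled)**2 for wi, li in zip(w, logs))  # Cochran's Q
    df = len(rrs) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return exp(pooled), i2

# Hypothetical study-level RRs for new-onset AF, CSP vs. RVP
rrs = [0.40, 0.52, 0.45, 0.38]
ses = [0.15, 0.20, 0.12, 0.25]
pooled, i2 = pool_rr(rrs, ses)
print(round(pooled, 2), round(i2, 1))  # prints 0.44 0.0
```

An I2 near zero, as in this toy example and in the review's pooled OR, indicates that the between-study variation is no larger than expected by chance.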
Our analysis included 8 observational studies comprising a total of 2033 patients. The results indicated that 20% (406/2033) of study patients experienced new-onset AF, and CSP was associated with a significantly lower risk of new-onset AF when compared with RVP (RR: 0.44, 95% CI: 0.36–0.54, p < 0.00001, I2 = 11%; OR: 0.34, 95% CI: 0.27–0.44, p < 0.0001, I2 = 0%). In the subgroup analysis, patients with atrioventricular block (AVB) tended to benefit more from CSP than those with sinus node dysfunction (SND) (p = 0.06 for RR; p = 0.12 for OR). Publication bias was observed and confirmed by the Egger's test (p = 0.0125 for RR and p = 0.0345 for OR). Trim and fill analysis was performed, and the overall summary effect size (RR: 0.51, 95% CI: 0.40–0.64; OR: 0.40, 95% CI: 0.31–0.52) remained significant after adjusting for publication bias.
CSP could reduce the occurrence of new-onset AF compared with RVP, and this benefit appeared to be more pronounced in patients with AVB than in those with SND. However, large-scale randomized controlled trials are needed to validate our findings.
Registration number: CRD42024569052; registration date: July 25, 2024; https://www.crd.york.ac.uk/PROSPERO/view/CRD42024569052.
An estimated 1.28 billion individuals in the global population suffer from hypertension. Importantly, uncontrolled hypertension is strongly linked to various cardiovascular and cerebrovascular diseases. The role of the renin-angiotensin system (RAS) is widely acknowledged in the development and progression of hypertension. This system comprises angiotensinogen, the renin/(pro)renin/(pro)renin receptor (PRR) axis, the renin/angiotensin-converting enzyme/angiotensin (Ang) II/Ang II type I receptor (AT1R) axis, the renin/angiotensin-converting enzyme (ACE) 2/Ang (1-7)/Mas receptor (MasR) axis, the alamandine/Mas-related G protein-coupled receptor D (MrgD) axis, and the renin/ACE/Ang II/Ang II type II receptor (AT2R) axis. Additionally, brain Ang III plays a vital role in regulating central blood pressure. The current overview presents the latest research findings on the mechanisms through which novel anti-hypertensive medications target the RAS. These include zilebesiran (targeting angiotensinogen), PRO20 (targeting the renin/(pro)renin/PRR axis), sacubitril/valsartan (targeting the renin/ACE/Ang II/AT1R axis), GSK2586881, Ang (1-7), and AVE0991 (targeting the renin/ACE2/Ang (1-7)/MasR axis), alamandine (targeting the alamandine/MrgD axis), C21 and β-Pro7-Ang III (targeting the renin/ACE/Ang II/AT2R axis), and EC33, firibastat, and NI956 (targeting brain Ang III).
Left ventricular assist devices (LVADs) have changed the landscape for patients with advanced heart failure (HF). With advances in pump design and management, patients with LVADs are living longer with improved quality of life despite having more comorbidities and complex structural heart disease. As such, HF cardiologists and surgeons collaborate more frequently with structural heart interventionalists to address the complex problems of patients with LVADs who present at different points of failure in their circuits. Unlike heart transplants and total artificial heart recipients, the native heart and its components must function to maintain successful circulatory support from these assist devices. Multiple points of potential failure of the native heart and the LVAD circuit exist that can result in significant morbidity and mortality. These include regurgitant valve lesions, interatrial shunts, outflow cannula obstruction, and pump thrombosis. Transcatheter interventions can be applied and tailored specifically to the anatomy of the individual in these situations to improve the lives and outcomes of our LVAD patients. This review provides a comprehensive approach for diagnosing and treating structural heart disease associated with patients who have LVADs, focusing on multidisciplinary collaboration and individualized interventional strategies.
To assess the precision of artificial intelligence (AI) in aiding the diagnostic process of congenital heart disease (CHD).
PubMed, Embase, Cochrane, and Web of Science databases were searched for clinical studies published in English up to March 2024. Studies using AI-assisted ultrasound for diagnosing CHD were included. The quality of the included studies was evaluated using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. The overall accuracy of AI-assisted imaging in the diagnosis of CHD was determined using Stata 15.0 software. Subgroup analyses were conducted based on region and model architecture.
The analysis encompassed a total of 7 studies, yielding 19 datasets. The combined sensitivity was 0.93 (95% confidence interval (CI): 0.88–0.96), and the specificity was 0.93 (95% CI: 0.88–0.96). The positive likelihood ratio was calculated as 13.0 (95% CI: 7.7–21.9), and the negative likelihood ratio was 0.08 (95% CI: 0.04–0.13). The diagnostic odds ratio was 171 (95% CI: 62–472). The summary receiver operating characteristic (SROC) curve analysis revealed an area under the curve of 0.98 (95% CI: 0.96–0.99). Subgroup analysis found that ResNet and DenseNet architecture models had better diagnostic performance than other models.
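The pooled indices reported above are linked by standard identities: LR+ = sensitivity/(1 − specificity), LR− = (1 − sensitivity)/specificity, and DOR = LR+/LR−. Evaluating them at the pooled sensitivity and specificity of 0.93 approximately reproduces the reported values (exact agreement is not expected, since each index was pooled separately in the bivariate model):

```python
# Check the diagnostic-accuracy identities at the reported pooled values.
sens, spec = 0.93, 0.93
lr_pos = sens / (1 - spec)   # positive likelihood ratio
lr_neg = (1 - sens) / spec   # negative likelihood ratio
dor = lr_pos / lr_neg        # diagnostic odds ratio
print(round(lr_pos, 1), round(lr_neg, 2), round(dor))  # prints 13.3 0.08 177
```

These are close to the reported pooled LR+ of 13.0, LR− of 0.08, and DOR of 171.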
AI demonstrates considerable value in aiding the diagnostic process of CHD. However, further prospective studies are required to establish its utility in real-world clinical practice.
CRD42024540525, https://www.crd.york.ac.uk/prospero/display_record.php?RecordID=540525.
Coronary artery calcification (CAC) is a robust independent predictor of cardiovascular events. Therefore, it is essential to elucidate the factors that influence CAC progression to enhance the outcomes of patients diagnosed with acute coronary syndrome (ACS). This study aimed to investigate the relationship between prevalent laboratory parameters and the calcification of coronary artery plaques in patients diagnosed with ACS by applying optical coherence tomography (OCT).
This single-center, cross-sectional study retrospectively evaluated patients with ACS who underwent percutaneous coronary intervention and OCT examinations at the Hebei General Hospital. Baseline data, laboratory parameters, and OCT imaging were analyzed. Comprehensive statistical analyses were conducted to elucidate the relationship between prevalent laboratory parameters and coronary artery plaque calcification.
In this study involving 130 patients, the platelet to lymphocyte ratio (PLR) demonstrated a significant positive correlation with coronary artery plaque calcification (rs = 0.373, p < 0.001), whereas albumin exhibited a significant negative correlation (rs = –0.585, p < 0.001). Both the PLR (odds ratio (OR) 1.011, 95% CI 1.002–1.019, p = 0.014) and albumin levels (OR 0.642, 95% CI 0.539–0.764, p < 0.001) emerged as significant independent predictors of plaque calcification. Receiver operating characteristic curve analysis identified a cutoff point for albumin at <40.65, yielding a sensitivity of 75.8% and a specificity of 77.9%. Comparatively, a PLR >145.04 demonstrated a sensitivity of 61.3% and a specificity of 76.5% for predicting plaque calcification.
Albumin and the PLR were significantly associated with plaque calcification in patients with ACS, serving as independent predictors of coronary artery plaque calcification. These parameters may significantly contribute to risk stratification and the future development of preventive strategies to mitigate adverse cardiovascular events.
Aortic stenosis (AS) is a significant and growing concern, with a prevalence of 2–3% in individuals aged over 65 years. Moreover, with an aging global population, the prevalence is anticipated to double by 2050. AS can arise from various etiologies, including calcific disease of a trileaflet valve, congenital valve abnormalities (e.g., bicuspid and unicuspid valves), and post-rheumatic disease, each of which distinctly shapes the onset and progression of the disease. The normal aortic valve has a trilaminar structure comprising the fibrosa, spongiosa, and ventricularis, which work together to maintain its function. In calcific AS, the disease begins with early calcification starting in high mechanical stress areas of the valve and progresses slowly over decades, eventually leading to extensive calcification resulting in impaired valve function. This process involves mechanisms similar to atherosclerosis, including lipid deposition, chronic inflammation, and mineralization. The progression of calcific AS is strongly associated with aging, with additional risk factors including male gender, smoking, dyslipidemia, and metabolic syndrome exacerbating the condition. Conversely, congenital forms of AS, such as bicuspid and unicuspid aortic valves, result in an earlier disease onset, typically 10–20 years earlier than that observed in patients with a normal tricuspid aortic valve. Rheumatic AS, although less common in developed countries due to effective antibiotic treatments, also exhibits age-related characteristics, with an earlier onset in individuals who experienced rheumatic fever in their youth. The only curative therapies currently available are surgical and transcatheter aortic valve replacement (TAVR). However, these options are sometimes too invasive for older patients; thus, management of AS, particularly in older patients, requires a comprehensive approach that considers age, disease severity, comorbidities, frailty, and each patient’s individual needs.
Although the valves used in TAVR demonstrate promising midterm durability, long-term data are still required, especially when used in younger individuals, usually with low surgical risk. Moreover, understanding the causes and mechanisms of structural valve deterioration is crucial for appropriate treatment selections, including valve selection and pharmacological therapy, since this knowledge is essential for optimizing the lifelong management of AS.
The left atrial appendage occlusion (LAAO) procedure is an important intervention for stroke prevention in patients with non-valvular atrial fibrillation who cannot tolerate anticoagulation. Accurate imaging is essential to guide and ensure optimal device deployment. Transesophageal echocardiography (TEE) has traditionally been the gold standard for procedural guidance, but intracardiac echocardiography (ICE) is emerging as an alternative owing to its unique advantages. This review examines the comparative effectiveness, procedural advantages, limitations, and clinical outcomes of ICE and TEE in LAAO closure, highlighting emerging trends and implications for future clinical practice.
The concurrent presence of iron deficiency (ID) and heart failure (HF) can worsen prognosis and reduce the quality of life for affected individuals. This study aimed to explore the effects of incorporating iron sucrose into standard HF treatments for patients with acute decompensated HF and ID.
We prospectively enrolled 65 hospitalized HF patients, all with a left ventricular ejection fraction of ≤40% and ID, defined as ferritin levels below 100 ng/mL or ferritin levels between 100 and 299 ng/mL with transferrin saturation below 20%. Patients were randomized into two groups: an iron sucrose group, which received intravenous iron sucrose in addition to standard HF treatment, and a control group, which received standard HF treatment alone. Serum ferritin, iron, transferrin saturation, and Kansas City Cardiomyopathy Questionnaire (KCCQ) scores were measured at baseline and at the 4-week follow-up.
Baseline characteristics, iron profiles, and KCCQ scores were comparable between the two groups. At 4 weeks, patients in the iron sucrose group had significantly higher serum ferritin levels than those in the control group (ferritin 485.3 ± 269.7 ng/mL vs. 225.5 ± 162.5 ng/mL, p < 0.001; Δferritin 382.2 ± 243.5 ng/mL vs. 97.4 ± 143.0 ng/mL, p < 0.001, respectively). Only 9.1% of patients in the iron sucrose group remained within the ID criteria, compared to 36.7% in the control group (p = 0.012). The ΔKCCQ score was 10.6 points higher (27.8 ± 19.5 vs. 17.1 ± 17.8 points, p = 0.031) in the iron sucrose group than in the control group.
Post-discharge intravenous iron sucrose may improve iron levels and quality of life in HF patients with ID.
NCT06703411, https://clinicaltrials.gov/expert-search?term=NCT06703411.
Myocardial diseases such as myocarditis and cardiomyopathies are clinically important and can cause complications such as heart failure and arrhythmias, which increase the risk of death. The combination of myocarditis with cardiomyopathy is difficult to diagnose because their manifestations often overlap, and multiple myocardial diseases are usually not included in the diagnostic search. Hypertrophic cardiomyopathy (HCM) is the most common cardiomyopathy; however, few studies have examined the combination of myocarditis and HCM, thereby highlighting the importance of this problem. This article aimed to analyze the influence of myocarditis on clinical features and outcomes in patients with HCM.
A literature search was performed using PubMed and the Scientific Electronic Library eLIBRARY.ru databases. Relevant studies, published until November 2023, were analyzed in detail. Studies were selected in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) standards.
Twelve studies (three original articles and nine clinical cases) were isolated from a total cohort of 1504 publications and were included in the study. The prevalence of myocarditis in HCM ranged from 23.5% to 46.7%. The presence of concomitant myocarditis in patients with HCM was associated with heart failure progression, worsening of ventricular arrhythmias, and an increased risk of sudden cardiac death.
The incidence of myocarditis in HCM is high. Early detection and treatment of myocarditis in patients with HCM may slow the progression of heart failure and rhythm disturbances and improve the disease prognosis.
The systematic review was registered in the International Prospective Register of Systematic Reviews PROSPERO (CRD42024499672, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024499672).
Congenital long QT syndrome (LQTS) is a potentially life-threatening hereditary arrhythmia characterized by a prolonged QT interval on electrocardiogram (ECG) due to delayed ventricular repolarization. This condition predisposes individuals to severe arrhythmic events, including ventricular tachycardia and sudden cardiac death. Traditional approaches to LQTS research and treatment are limited by an incomplete understanding of its gene-specific pathophysiology, variable clinical presentation, and the challenges associated with developing effective, personalized therapies. Recent advances in human induced pluripotent stem cell (iPSC) technology have opened new avenues for elucidating LQTS mechanisms and testing therapeutic strategies. By generating cardiomyocytes from patient-specific iPSCs (iPSC-CMs), it is now possible to recreate the patient’s genetic context and study LQTS in a controlled environment. This comprehensive review describes how iPSC technology deepens our understanding of LQTS and accelerates the development of tailored treatments, as well as ongoing challenges such as incomplete cell maturation and cellular heterogeneity.
This systematic review and meta-analysis aimed to evaluate the predictive effect of Growth Differentiation Factor-15 (GDF-15) on adverse outcomes in patients undergoing cardiovascular interventions.
A comprehensive literature search was performed across the PubMed, EMBASE, Cochrane Library, and Web of Science databases. The meta-analysis used hazard ratios (HR) and odds ratios (OR) to compare outcomes such as all-cause mortality, cardiovascular death, postoperative atrial fibrillation (AF), acute kidney injury (AKI), and spontaneous myocardial infarction (MI) between high GDF-15 level and control groups. Subgroup analyses were conducted based on study design and GDF-15 cutoff levels. Publication bias was evaluated using funnel plots and Egger's test.
A total of 13 studies were included in the meta-analysis. The study revealed a significant association between elevated GDF-15 levels and increased all-cause mortality. Subgroup analysis showed a significant association in retrospective studies but not in prospective studies. Higher GDF-15 cutoff levels (>2 ng/mL) were more strongly associated with increased mortality than lower cutoff levels (≤2 ng/mL). Elevated GDF-15 levels were found to be significantly associated with increased risks of cardiovascular death, AKI, and spontaneous MI. No significant difference was observed in the incidence of postoperative AF. The overall adverse outcomes analysis showed no significant difference. Subgroup analyses suggested significant associations primarily observed in studies with higher GDF-15 cutoffs.
Elevated GDF-15 levels are associated with increased risks of all-cause mortality, cardiovascular death, AKI, and spontaneous MI in patients undergoing cardiovascular interventions. Due to the heterogeneity of the studies, including variations in surgical techniques, the conclusions should be interpreted with caution.
CRD42024582279, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024582279.
Cardiac arrest (CA) is associated with high incidence and mortality rates. Hence, assessing the prognosis of CA patients is crucial for optimizing clinical treatment. This study aimed to develop and validate a clinically applicable nomogram for predicting the risk of in-hospital mortality in CA patients.
We retrospectively collected the clinical data of CA patients admitted to two hospitals in Zhejiang Province between January 2018 and June 2024. These patients were randomly assigned to the training set (70%) and the internal validation set (30%). Variables of interest included demographics, comorbidities, CA-related characteristics, vital signs, and laboratory results, and the outcome was defined as in-hospital death. Variables were selected using least absolute shrinkage and selection operator (LASSO) regression, recursive feature elimination (RFE), and eXtreme Gradient Boosting (XGBoost). Multivariate regression analysis was then used to identify independent risk factors. Subsequently, prediction models were developed in the training set and validated in the internal validation set. Receiver operating characteristic (ROC) curves were plotted, and the area under these curves (AUC) was calculated to compare the discriminative ability of the models. The model with the highest performance was further validated in an independent external cohort and was subsequently represented as a nomogram for predicting the risk of in-hospital mortality in CA patients.
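As an aside on how LASSO performs variable selection: coordinate descent updates each coefficient with a soft-thresholding operator that shrinks weak effects to exactly zero, dropping them from the model. The sketch below applies the operator to invented coefficients (the predictor names echo the nomogram's variables, but the numbers are hypothetical, not study estimates):

```python
# Minimal sketch of why LASSO drops variables: soft-thresholding snaps
# weak standardized effects to exactly zero. All coefficients are invented.

def soft_threshold(beta, lam):
    """S(beta, lam) = sign(beta) * max(|beta| - lam, 0)."""
    if beta > lam:
        return beta - lam
    if beta < -lam:
        return beta + lam
    return 0.0

# Hypothetical unpenalized effect estimates on a standardized scale
raw = {"age": 0.80, "lactate": 1.10, "temperature": -0.05, "SOFA": 0.95,
       "hypertension": 0.12, "BUN": 0.40}
lam = 0.15  # penalty strength, normally chosen by cross-validation
selected = {k: round(soft_threshold(b, lam), 2) for k, b in raw.items()}
print(selected)  # "temperature" and "hypertension" shrink to 0 and drop out
```

Predictors whose penalized coefficients remain nonzero are the ones carried forward into the multivariate model and, ultimately, the nomogram.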
This study included 996 CA patients, with an in-hospital mortality rate of 49.9% (497/996). The LASSO regression model significantly outperformed the RFE and XGBoost models in predicting in-hospital mortality, with an AUC value of 0.81 (0.78, 0.84) in the training set and 0.85 (0.80, 0.89) in the internal validation set. The AUC values for these sets in the RFE model were 0.74 (0.70, 0.78) and 0.77 (0.72, 0.83), respectively, and those for the XGBoost model were 0.75 (0.71, 0.79) and 0.77 (0.72, 0.83), respectively. For the optimal prediction model, the AUC value of the LASSO regression model in the external validation set was 0.84 (0.78, 0.90). The LASSO regression model was represented as a nomogram incorporating several independent risk factors, namely age, hypertension, cause of arrest, initial heart rhythm, vasoactive drugs, continuous renal replacement therapy (CRRT), temperature, blood urea nitrogen (BUN), lactate, and Sequential Organ Failure Assessment (SOFA) scores. Calibration and decision curves confirmed the predictive accuracy and clinical utility of the model.
We developed a nomogram to predict the risk of in-hospital mortality in CA patients, using variables selected via LASSO regression. This nomogram demonstrated strong discriminative ability and clinical practicality.
Acute ischemic mitral regurgitation is a rare but potentially catastrophic complication of acute myocardial infarction (AMI), characterized by severe clinical presentation and high mortality, although advances in primary percutaneous coronary intervention (PCI) have reduced the incidence of acute mitral regurgitation (AMR). Surgery remains the standard treatment but is associated with high rates of complications and in-hospital mortality, particularly in patients with cardiogenic shock or mechanical complications such as papillary muscle rupture. Mitral transcatheter edge-to-edge repair (M-TEER) has emerged as a minimally invasive alternative. Current evidence demonstrates the feasibility and safety of M-TEER in reducing mitral regurgitation, stabilizing hemodynamics, and improving in-hospital and short-term survival. The procedural success rate is high, with notable improvements in symptoms and functional status. Mortality rates remain substantial, reflecting the severity of AMR, but are lower than with medical management alone. Challenges remain regarding the optimal timing of M-TEER, long-term device durability, and patient selection criteria. Ongoing refinement of device technology and procedural techniques is expected to enhance outcomes. This review highlights the role of M-TEER in AMR management, emphasizing the need for multidisciplinary decision-making and further research to refine its application and improve outcomes in this high-risk population.
Electrophysiology (EP) procedures, including cardiac implantable electronic devices (CIEDs) and ablations, are widely used to manage arrhythmias and heart failure. These interventions, though effective, require substantial resources, prompting the need for systematic economic evaluations to inform healthcare decision-making.
A systematic review of studies from 2007 to 2024 was conducted in two phases. Phase one assessed trends in economic evaluations of EP procedures, analyzing 129 studies across regions and timeframes. Phase two focused on cost-effectiveness analyses of implantable cardioverter defibrillators (ICDs), cardiac resynchronization therapy defibrillators (CRT-Ds), and atrial fibrillation (AF) ablation, examining outcomes such as quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs), while identifying factors influencing economic results.
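The ICER metric examined in phase two is the ratio of incremental cost to incremental health benefit: ICER = (Cost_new − Cost_comparator) / (QALY_new − QALY_comparator), compared against a willingness-to-pay threshold. The snippet below is a minimal illustration with entirely hypothetical figures (the costs and QALY values are made up for the example, not drawn from the review).

```python
def icer(cost_new: float, cost_old: float,
         qaly_new: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical figures: AF ablation vs. antiarrhythmic drug therapy
value = icer(cost_new=25_000, cost_old=15_000, qaly_new=7.2, qaly_old=6.7)
print(round(value))  # → 20000 (cost units per QALY gained)
```

An intervention is typically deemed cost-effective when this ratio falls below the jurisdiction's willingness-to-pay threshold, which is one reason the review's conclusions vary by region.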
EP procedures generally demonstrated favorable cost-effectiveness, particularly in high-income regions. Studies on ICDs and CRT-Ds consistently supported their economic value for patients with arrhythmias or heart failure, while AF ablation showed potential for long-term benefits, particularly when compared to medical therapies. However, results varied by region, reflecting differences in healthcare systems, costs, and patient populations.
The review highlights the overall cost-effectiveness of EP procedures in many settings but underscores the need for tailored economic evaluations in low- and middle-income countries. Simplified methodologies and greater attention to regional contexts are recommended to guide resource allocation and policy development globally.
This study aimed to determine the impact of central adjudication of site-reported events in patients with acute coronary syndromes treated with ticagrelor or clopidogrel in addition to aspirin within the frame of the indication-seeking PLATelet Inhibition and Clinical Outcomes (PLATO) trial. Adjudication in randomized outcome-driven trials is supposed to maintain integrity by applying uniform rules for the quality assessment of clinical events. Preliminary data suggested an imbalance between central and site diagnoses in PLATO. We gained access to the Food and Drug Administration (FDA)-issued adjudication dataset and analyzed the evidence.
Death, myocardial infarction (MI), stroke/transient ischemic attack (TIA), bleeding, arterial thrombotic events, and cardiac ischemic events underwent central adjudication. We assessed geography, timing, impact of disagreements, and primary endpoint composition.
Among 18,624 trial enrollees, 10,704 central adjudications occurred across 7171 patients in 43 countries. There were 938 deaths, 2751 cases of MI, 359 strokes/TIAs, 2680 cardiac events, 130 thrombotic events, and 3782 bleeding events. Site and central diagnoses matched for 5451 events, while mismatches favoring clopidogrel (n = 2535) or ticagrelor (n = 2706) (p = 0.79) were common, comprising major (n = 1797), moderate (n = 942), or minor (n = 735) disagreements. The central decision prevailed in 2945 cases. Adjudication of the 2007–2008 events was significantly delayed (HR = 0.84; 95% confidence interval (CI): 0.75–0.95; p = 0.004), with many finalized only in 2009. Ticagrelor was significantly less favored in 2009 than in 2007–2008 (HR = 1.19; 95% CI: 1.05–1.34; p = 0.005). Bleeding adjudication showed a remarkably consistent match between treatment arms (HR = 1.02; 95% CI: 0.83–1.25; p = 0.859). The primary endpoint in the PLATO trial exhibited highly significant disagreement favoring ticagrelor for vascular death (HR = 2.02; 95% CI: 1.1–3.64; p = 0.019), MI (HR = 2.31; 95% CI: 2.79–43.94; p = 0.034), stroke (HR = 1.37; 95% CI: 2.66–63.28; p = 0.036), and total events (HR = 2.51; 95% CI: 1.86–3.39; p = 0.01).
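Reported (HR, 95% CI, p-value) triplets like those above can be checked for internal consistency with the standard Wald relationship on the log scale: SE(ln HR) ≈ (ln U − ln L) / (2 × 1.96), and p follows from z = ln HR / SE under a normal approximation. The sketch below (the helper name is ours, not from the study) recovers the adjudication-delay estimate's p-value from its confidence interval.

```python
import math

def hr_ci_consistency(hr: float, lo: float, hi: float):
    """Recover SE(log HR) from a reported 95% CI, then back out the
    two-sided Wald p-value via the normal approximation."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    z = math.log(hr) / se
    # Two-sided p from the standard normal CDF, Phi(z) = 0.5*(1 + erf(z/sqrt(2)))
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return se, p

# Reported adjudication-delay estimate: HR = 0.84, 95% CI 0.75-0.95, p = 0.004
se, p = hr_ci_consistency(0.84, 0.75, 0.95)
print(round(se, 3), round(p, 3))  # → 0.06 0.004
```

The recovered p-value matches the reported 0.004, confirming that particular triplet is internally consistent; the same check can flag triplets where the point estimate falls outside its own interval.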
Central adjudication in the PLATO trial was delayed and impacted the primary endpoint by inflating the ticagrelor benefit, resulting in drug approval. Regulatory authorities should consider independent audits when unblinding is suspected in indication-seeking clinical trials.
Cardiovascular disease (CVD) remains the foremost cause of morbidity and mortality worldwide. Recent advancements in machine learning (ML) have demonstrated substantial potential in augmenting risk stratification for primary prevention, surpassing conventional statistical models in predictive performance. Thus, integrating ML with Electronic Health Records (EHRs) enables refined risk estimation by leveraging the granularity and breadth of longitudinal individual patient data. However, fundamental barriers persist, including limited generalizability, challenges in interpretability, and the absence of rigorous external validation, all of which impede widespread clinical deployment.
This review adheres to the methodological rigor of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and the Scale for the Assessment of Narrative Review Articles (SANRA) guidelines. A systematic literature search of the Medline and Embase databases was performed in March 2024 to identify studies published since 2010. Supplementary references were retrieved from the Institute for Scientific Information (ISI) Web of Science and through manual searches. The selection process, conducted via Rayyan, focused on systematic and narrative reviews evaluating ML-driven models for long-term CVD risk prediction in primary prevention contexts utilizing EHR data. Studies investigating short-term prognostication, highly specific comorbid cohorts, or conventional models devoid of ML components were excluded.
Following an exhaustive screening of 1757 records, 22 studies met the inclusion criteria. Of these, 10 were systematic reviews (four incorporating meta-analyses) and 12 were narrative reviews, with the majority published after 2020. The synthesis underscores the superiority of ML in modeling intricate EHR-derived risk factors, facilitating precision-driven cardiovascular risk assessment. Nonetheless, salient challenges endure: heterogeneity in CVD outcome definitions undermines comparability, data incompleteness and inconsistency compromise model robustness, and a dearth of external validation constrains clinical translatability. Moreover, ethical and regulatory considerations, including algorithmic opacity, equity in predictive performance, and the absence of standardized evaluation frameworks, pose formidable obstacles to seamless integration into clinical workflows.
Despite its transformative potential, ML-based CVD risk prediction remains encumbered by methodological, technical, and regulatory impediments that hinder full-scale adoption in real-world healthcare settings. This review underscores the imperative for standardized validation protocols, stringent regulatory oversight, and interdisciplinary collaboration to bridge the translational divide. Our findings establish an integrative framework for developing, validating, and applying ML-based CVD risk prediction algorithms, addressing both clinical and technical dimensions. To further advance the field, we propose a standardized, transparent, and regulated EHR platform that facilitates fair model evaluation, reproducibility, and clinical translation by providing a high-quality, representative dataset with structured governance and benchmarking mechanisms. Future endeavors must prioritize enhancing model transparency, mitigating biases, and ensuring adaptability to heterogeneous clinical populations, fostering equitable and evidence-based implementation of ML-driven predictive analytics in cardiovascular medicine.