In the era of drug-eluting stents (DESs), few studies have explored the association between arterial stiffness and the risk of in-stent restenosis (ISR).
Pulse pressure and the pulse pressure index (PPI; the ratio of pulse pressure to systolic blood pressure), both noninvasive measures of arterial stiffness, were measured before percutaneous coronary intervention (PCI). ISR was defined as angiographic evidence of ≥50% stenosis within the previously stented segment. Logistic regression was used to calculate odds ratios (ORs) and 95% confidence intervals (CIs) for ISR.
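As a worked illustration of these definitions, the following minimal sketch (synthetic data, hypothetical column names) derives pulse pressure and PPI from pre-PCI blood pressure readings and estimates the per-1-SD OR for ISR with logistic regression, mirroring the standardized analysis reported below:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 644
sbp = rng.normal(135, 18, n)                  # systolic BP, mmHg
dbp = sbp - rng.normal(55, 14, n)             # diastolic BP, mmHg
df = pd.DataFrame({"sbp": sbp, "dbp": dbp})
df["pulse_pressure"] = df["sbp"] - df["dbp"]
df["ppi"] = df["pulse_pressure"] / df["sbp"]  # PPI = pulse pressure / systolic BP

# Simulate ISR with higher odds at higher PPI, then recover the per-1-SD OR
df["ppi_z"] = (df["ppi"] - df["ppi"].mean()) / df["ppi"].std()
p = 1 / (1 + np.exp(-(-2.0 + 0.4 * df["ppi_z"])))
df["isr"] = (rng.random(n) < p).astype(int)

model = sm.Logit(df["isr"], sm.add_constant(df[["ppi_z"]])).fit(disp=0)
or_sd = np.exp(model.params["ppi_z"])
lo, hi = np.exp(model.conf_int().loc["ppi_z"])
print(f"OR per 1-SD PPI: {or_sd:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```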
A total of 644 patients were included, 72 of whom were in the ISR group. Pulse pressure and PPI were significantly higher in the ISR group (ISR vs. no ISR: pulse pressure, 58.5 ± 16.3 vs. 53.1 ± 13.7 mmHg [p = 0.002]; PPI, 0.43 ± 0.07 vs. 0.40 ± 0.07 [p = 0.001]). Multivariable-adjusted ORs for ISR, for tertile 3 vs. tertile 1, were 2.73 (95% CI, 1.33–5.62; p = 0.006) for pulse pressure and 2.12 (95% CI, 1.04–4.31; p = 0.038) for PPI. The ORs for ISR per 1-standard deviation (SD) increase were 1.41 (95% CI, 1.09–1.83; p = 0.010) for pulse pressure and 1.52 (95% CI, 1.15–2.01; p = 0.003) for PPI.
Arterial stiffness, indicated by a high pulse pressure and PPI, is a predictive factor for ISR. A wide pre-PCI pulse pressure could serve both as a risk marker and as a potential target for future therapies.
ChiCTR2000039901, https://www.chictr.org.cn/showproj.html?proj=51063.
The temporal trend and disparities in cardiovascular disease (CVD) mortality risk among long-term survivors of different Hodgkin lymphoma (HL) types are unclear. Therefore, we aimed to examine the temporal trend and disparities in CVD mortality risk among survivors of various HL subtypes.
This multicenter cohort included 20,423 patients with HL diagnosed between 1975 and 2018, with an average follow-up time of 18.5 years. Proportional mortality ratio, cumulative cause-specific mortality accounting for competing risks, standardized mortality ratio, and absolute excess risk were calculated.
Patients with nodular lymphocyte-predominant HL (NLPHL) and classical HL exhibited higher CVD-related deaths than HL-related deaths after approximately 12 and 120 months of follow-up, respectively. From the initial diagnosis to >500 months of follow-up, the cumulative CVD mortality increased continuously without a plateau and exceeded that of HL at different times in most patients with various HL types. However, CVD mortality risk exceeded that of HL earlier in NLPHL than in other types. Black or male patients with nodular sclerosing classical HL exhibited a higher CVD mortality risk, while a contrary trend was noted among those with lymphocyte-rich classical HL or lymphocyte-depleted classical HL. Over the past decades, CVD mortality risk has decreased slowly or remained unchanged. Patients with HL exhibited higher risks of CVD mortality than the general population.
CVD mortality risk exceeded that of HL over time among many survivors. This temporal trend was significantly different among various HL subtypes. Thus, more effective strategies are required to reduce the risk of CVD mortality, depending on subtypes.
The aim of the study was to systematically evaluate and compare the efficacy of everolimus-eluting stents (EESs) and paclitaxel-coated balloons (PCBs) in treating patients with in-stent restenosis (ISR).
We performed a comprehensive search of the PubMed, Cochrane Library, Web of Science, and Embase databases up to August 2024. Two researchers independently conducted literature retrieval, screening, data extraction, and quality assessment. The meta-analysis was performed using Stata 17.0.
A total of ten randomized controlled trials (RCTs) were included, all of which were assessed with the Cochrane quality assessment tool and categorized as having a low risk of bias. The analysis revealed a significantly higher need for target lesion revascularization in the PCB group than in the EES group (odds ratio (OR) = 2.74, 95% confidence interval (CI): 1.80–4.16, p < 0.001, I² = 38.6%). There were no significant differences between EES- and PCB-treated ISR patients in all-cause mortality, cardiac death, myocardial infarction, or stent thrombosis within one year. Subgroup analyses based on ISR causative factors were consistent with the overall findings and showed substantially reduced heterogeneity.
PCBs are associated with a higher frequency of target lesion revascularization compared to EES in the treatment of ISR. However, there are no significant differences in other outcome indicators. Therefore, EES is recommended as the preferred treatment for ISR in clinical decision-making.
INPLASY202480079, https://inplasy.com/inplasy-2024-8-0079/.
Invasive hemodynamic monitoring provides essential information for managing patients with acute heart failure (AHF) and cardiogenic shock (CS), aiding circulatory shock phenotyping and supporting individualized, hemodynamically based therapeutic management. The hemodynamic trajectory after the initial care bundle refines prognostication and anticipates hospital outcomes. Invasive hemodynamic monitoring also tracks the clinical response to supportive measures, providing an objective basis for therapeutic escalation/de-escalation, facilitating titration of vasoactive agents and temporary mechanical circulatory support (tMCS) to achieve an optimal balance between native heart function and device assistance, and allowing repeated reassessment of hemodynamics during the weaning phase. Therefore, complete hemodynamic assessment (i.e., arterial line, central venous catheter, and pulmonary artery catheter) is recommended for any patient in overt CS; we also provide pragmatic clinical, imaging, and laboratory criteria to identify patients in the beginning stages of CS who could also benefit from complete invasive hemodynamic assessment. Specific hemodynamic phenotypes applicable in clinical practice and case-based examples of how the invasive hemodynamic phenotype can change after therapeutic actions are presented to provide pragmatic guidance on invasive hemodynamic monitoring. This review also summarizes the available monitoring technologies, describing the current limitations of each and the perspectives for future developments in the era of artificial intelligence. The gaps in evidence that still characterize pulmonary artery catheter use, i.e., the lack of a robust positive randomized clinical trial in CS, are discussed, along with the wide body of non-randomized studies currently supporting its use in CS. The reappraisal of invasive hemodynamic monitoring, closely linked to the advent and increasing adoption of tMCS, sets the stage for greater adoption of this clinical tool, which remains fundamental for the intensive care cardiologist.
Ventricular septal rupture (VSR) is a life-threatening complication of myocardial infarction. While surgical repair is regarded as the definitive treatment, the optimal approach to revascularization remains uncertain. This study aims to evaluate the effects of infarct-related artery (IRA) revascularization and the completeness of revascularization on long-term survival and the incidence of major adverse cardiovascular and cerebrovascular events (MACCE) in patients with VSR.
This retrospective study analyzed 132 VSR patients who underwent surgical repair at Fuwai Hospital from 2004 to 2022. Patients were categorized based on whether they received IRA revascularization. For those with multi-vessel disease (MVD), revascularization was classified as complete or incomplete. The primary outcome was all-cause mortality, with a mean follow-up of 77.8 months (median 71.0 months). The secondary outcome was MACCE.
Of the 132 patients, 28 did not undergo IRA revascularization. Kaplan-Meier analysis showed similar all-cause mortality and MACCE rates between patients with and without IRA revascularization. Adjusted Cox regression confirmed no significant association between IRA revascularization and long-term mortality (adjusted hazard ratio [aHR], 0.62; 95% CI: 0.22–1.79) or MACCE (aHR, 1.30; 95% CI: 0.52–3.27). These findings were consistent across both single-vessel and MVD patients. Among the 84 MVD patients, 53 underwent complete revascularization. Patients with complete revascularization had a lower incidence of MACCE (aHR, 0.26; 95% CI: 0.10–0.67) compared to those with incomplete revascularization, although no significant difference in mortality was observed (aHR, 0.57; 95% CI: 0.17–1.85).
IRA revascularization was not associated with long-term survival or MACCE rates in VSR patients. However, complete revascularization significantly reduced the risk of MACCE in patients with MVD.
Myocardial infarction (MI)-related arrhythmias are an important risk factor for sudden cardiac death. Aberrant function of the cardiac voltage-gated sodium channel (Nav1.5) is important in the development of ventricular arrhythmias after an MI. The underlying mechanisms are profoundly complex and involve sodium voltage-gated channel α subunit 5 (SCN5A) and sodium voltage-gated channel α subunit 10 (SCN10A) single nucleotide polymorphisms, aberrant splicing of SCN5A mRNAs, transcriptional and post-transcriptional regulation, translation, post-translational transport and modification, and protein degradation. These mechanisms ultimately promote a decrease in peak sodium currents, an increase in late sodium currents, and changes in sodium channel kinetics. This review explores the specific mechanisms of Nav1.5 in post-MI arrhythmias and summarizes the potential of therapeutic drugs. An in-depth study of the effect of Nav1.5 on arrhythmias after myocardial ischemia is of crucial clinical significance.
Left ventricular thrombus (LVT) is associated with major adverse cardiovascular and cerebrovascular events (MACCEs). Anticoagulation represents the current primary management for LVT; however, current studies in some Asian populations suggest that the anticoagulation benefit in LVT patients is not significant. Given the heterogeneity of clinical phenotypes in LVT patients, the population of LVT patients who benefit from anticoagulation needs to be further explored.
This study included patients diagnosed with LVT at the FuWai Hospital from 2009 to 2021. We performed a latent class analysis (LCA) based on important clinical characteristics to objectively determine the number and dimensionality of clusters. Additionally, Kaplan–Meier curves and a Cox analysis were used to explore the relationship between anticoagulation therapy and MACCEs and major bleeding events in LVT patients.
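Classical LCA implementations live mainly in R (e.g., poLCA) or Mplus; as a Python stand-in, the hedged sketch below clusters synthetic patients with a Gaussian mixture (BIC-based choice of cluster count, mirroring LCA model selection) and then fits a per-cluster Cox model relating anticoagulation to MACCEs. All column names are hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300  # synthetic stand-in for the 1085-patient cohort
df = pd.DataFrame({
    "age": rng.normal(60, 12, n),
    "lvef": rng.normal(40, 10, n),          # left ventricular ejection fraction, %
    "cardiomyopathy": rng.integers(0, 2, n),
    "anticoagulation": rng.integers(0, 2, n),
    "followup_months": rng.exponential(36, n),
    "macce": rng.integers(0, 2, n),
})

# Cluster on clinical characteristics; pick k with the lowest BIC
X = StandardScaler().fit_transform(df[["age", "lvef", "cardiomyopathy"]])
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(2, 7)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
df["cluster"] = fits[best_k].predict(X)

# Within each cluster, relate anticoagulation to MACCEs with a Cox model
for c, sub in df.groupby("cluster"):
    cph = CoxPHFitter().fit(sub[["followup_months", "macce", "anticoagulation"]],
                            duration_col="followup_months", event_col="macce")
    print(f"cluster {c}: HR = {cph.hazard_ratios_['anticoagulation']:.2f}")
```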
A total of 1085 patients were enrolled in this study, and during a median follow-up time of 36.5 months, 206 patients developed MACCEs, while 16 patients developed major bleeding events. Moreover, 1085 patients were categorized into four clusters following the LCA. In the adjusted model, the risk of MACCEs was significantly lower in LVT patients receiving anticoagulation in cluster 4 (hazard ratio (HR): 0.486, 95% confidence interval (CI): 0.243–0.971) than in the group not receiving anticoagulation; however, there were no differences in the other three clusters or the whole population. There was a significant interaction between anticoagulation and the clustered subgroups (p for interaction in MACCEs: 0.046). However, no significant correlation was found for major bleeding events across clusters or for anticoagulant therapy.
Our study suggests that not all LVT patients benefit from anticoagulation therapy; younger LVT patients with fewer complications and more cardiomyopathies are more likely to benefit from anticoagulation therapy.
Currently, there are limited data on the clinical outcomes of percutaneous coronary intervention (PCI) compared to coronary artery bypass grafting (CABG) for the treatment of chronic total occlusion (CTO). We compared the clinical outcomes of patients with CTO lesions treated by PCI versus CABG.
This study included 2587 patients with coronary artery disease (CAD) and CTO treated between January 1, 2019 and December 31, 2021. Short- and long-term clinical outcomes were compared in patients with CTO who received successful revascularization. The primary endpoint, major adverse cardiac and cerebrovascular events (MACCE), was a composite of all-cause mortality, cerebrovascular events, and myocardial infarction. Unplanned revascularization and heart failure hospitalization were separately defined as secondary endpoints. Propensity score matching was applied to balance baseline characteristics between the two groups.
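A minimal propensity score matching sketch on synthetic data with hypothetical covariates: 1:1 nearest-neighbor matching (with replacement, for brevity) on the logit of the propensity score, with a caliper of 0.2 SD of the logit, a common convention:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age": rng.normal(62, 10, n),
    "lvef": rng.normal(50, 8, n),
    "diabetes": rng.integers(0, 2, n),
    "pci": rng.integers(0, 2, n),  # 1 = PCI, 0 = CABG
})

# Propensity of receiving PCI given baseline covariates
X = df[["age", "lvef", "diabetes"]]
ps = LogisticRegression(max_iter=1000).fit(X, df["pci"]).predict_proba(X)[:, 1]
df["logit_ps"] = np.log(ps / (1 - ps))

treated = df[df["pci"] == 1]
control = df[df["pci"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
dist, idx = nn.kneighbors(treated[["logit_ps"]])

caliper = 0.2 * df["logit_ps"].std()
keep = dist.ravel() <= caliper
matched = pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])
print(f"{keep.sum()} matched pairs")
```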
The PCI group had a lower MACCE rate (0.47% vs. 2.11%) within 30 days of the index procedure, but the difference did not reach statistical significance (p = 0.06). After an average follow-up of 37.2 months, no significant differences were observed between PCI and CABG in all-cause mortality (hazard ratio [HR] = 2.29, 95% CI: 0.79–6.61; p = 0.13), MACCE (HR = 2.03, 95% CI: 0.86–4.76; p = 0.10), or heart failure hospitalization (subdistribution HR [SHR] = 0.98, 95% CI: 0.26–3.74; p = 0.98). However, patients who underwent PCI had a higher risk of unplanned revascularization (SHR = 10.32, 95% CI: 2.42–43.95; p = 0.002).
In patients with CAD with CTO, PCI was associated with a trend of lower short-term MACCE compared to CABG, but with a higher risk of long-term unplanned revascularization. There were no significant differences in long-term all-cause mortality, MACCE, or heart failure hospitalization rates between PCI and CABG.
Acute kidney injury (AKI) is a major complication of cardiac surgery, particularly in patients with pre-existing chronic kidney disease (CKD), who are at higher risk due to their compromised renal function. This study investigated the association between the triglyceride–glucose (TyG) index, a marker of insulin resistance, and postoperative AKI in CKD patients undergoing cardiac surgery to enhance risk stratification and perioperative management.
This retrospective study included 542 patients with impaired renal function (estimated glomerular filtration rate (eGFR) 15–60 mL/min/1.73 m2) undergoing cardiac surgery from January 2018 to December 2019. The TyG index was calculated as Ln(fasting triglycerides [mg/dL] × fasting blood glucose [mg/dL]/2), and outcomes were defined as postoperative AKI (per Kidney Disease: Improving Global Outcomes (KDIGO) criteria), in-hospital mortality, and length of hospital stay. Multivariate logistic regression and subgroup analyses assessed the association between TyG and these endpoints.
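The TyG formula above translates directly to code; a worked example:

```python
import numpy as np

def tyg_index(triglycerides_mgdl: float, glucose_mgdl: float) -> float:
    """TyG = ln(fasting triglycerides [mg/dL] x fasting glucose [mg/dL] / 2)."""
    return np.log(triglycerides_mgdl * glucose_mgdl / 2)

print(tyg_index(150, 100))  # ln(150 * 100 / 2) = ln(7500) ≈ 8.92
```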
Among the 542 patients, 55.7% developed AKI, and the in-hospital mortality rate was 7.6%. In the multivariate regression analysis, the odds ratio for AKI per unit increase in LnTyG was 0.43 (95% CI 0.02–8.70, p = 0.579), and per 1-SD increase in the standardized TyG index it was 0.96 (95% CI 0.77–1.21, p = 0.754). Subgroup analyses stratified by age, sex, CKD stage, and diabetes status revealed no significant associations in any stratum (all p for interaction > 0.05).
The TyG index is not significantly associated with AKI or prognosis after cardiac surgery in patients with kidney dysfunction. Further studies are needed to elucidate the role of insulin resistance in the pathogenesis of AKI.
Vascular smooth muscle cells (VSMCs) are involved in atherosclerotic plaque development. The formation of VSMC-originated foam cells, phenotypic switching, and VSMC proliferation, migration, apoptosis, and autophagy play different roles in atherosclerosis (AS).
Foam cell formation promotes the generation and evolution of atherosclerotic plaques. The switching of the VSMC phenotype from a contractile form to other forms is important in the formation and progression of AS. VSMC proliferation, migration, and apoptosis affect the stability of atherosclerotic plaques through the fibrous cap. Proliferation and migration can thicken the fibrous cap, which protects plaques from rupture and slows the development of advanced lesions, whereas apoptosis can accelerate plaque rupture and trigger severe cardiovascular disease. VSMC autophagy safeguards cellular homeostasis in the early stages of AS, but increased autophagy in the late stages can lead to cell death, thereby compromising the stability of late-stage plaques. This review summarizes recent research on the genes, proteins, and mechanisms influencing various aspects of VSMCs, including VSMC-derived foam cells, phenotypic switching, proliferation, migration, apoptosis, and autophagy. Additionally, it examines the implications of VSMCs for AS and discusses several regulators that can impact the progression of this condition.
Based on the vital role of VSMCs in AS, this review provides an overview of the latest factors and mechanisms based on VSMC-derived foam cells, phenotype switching, proliferation, migration, apoptosis, and autophagy. The review also introduces certain regulators that can inhibit the development of AS. An understanding of the role of VSMCs aids in identifying new targets and directions for advancing innovative anti-atherosclerotic therapeutic regimens and provides new insights into the development of treatments for AS.
The aim of this study was to compare the image quality of coronary computed tomography angiography (CCTA) images obtained using contrast enhancement boost (CE-boost) technology combined with deep learning reconstruction technology at a low dose and low contrast agent flow rate/dosage with traditional CCTA images, while exploring the potential application of this technology in early screening of coronary artery disease.
From March 2024 to September 2024, 60 patients with suspected coronary artery stenosis were enrolled in this study, of whom 46 were ultimately included in the analysis. Based on the acquisition protocol, patients were divided into Group A and Group B. Group A underwent conventional computed tomography (CT) angiography with a tube voltage of 120 kV, a contrast agent injection rate of 6 mL/s, and a dosage of 0.9 mL/kg. Group B received a triple-low CCTA protocol with a tube voltage of 100 kV, a contrast agent injection rate of 2 mL/s, and a dosage of 0.3 mL/kg. Additionally, Group C was created by applying CE-Boost combined with deep learning reconstruction to the Group B images. The radiation dose and contrast agent dosage were compared between Groups A and B. Image quality in the three groups, including CT values, background noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR), was also compared, with p < 0.05 indicating statistical significance.
Our results indicate that Group B required 67.8% less contrast agent and a 52.0% lower radiation dose than Group A (Group A vs. B: 64.68 ± 3.30 mL vs. 20.19 ± 2.22 mL; 6.21 (4.60, 7.78) mSv vs. 2.05 (1.42, 4.33) mSv; all p < 0.05). Image analysis revealed superior subjective scores in Groups A (4.68 ± 0.72) and C (4.38 ± 0.95) versus Group B (4.25 ± 0.10) (both p < 0.05), with no statistical difference between Groups A and C. CT values were significantly higher in Group A across all vessels than in Groups B and C (p < 0.05), while Group C exceeded Group B after CE-Boost. SNR in Group A was superior to that in Group B in the proximal right coronary artery (RCA-1), left main coronary artery (LM), left anterior descending coronary artery (LAD), and left circumflex coronary artery (LCX), and superior to that in Group C in the RCA-1 and LM (p < 0.05); conversely, SNR in Group C was superior to that in Group B in the middle and distal right coronary artery (RCA-2/3), LM, LAD, and LCX. CNR was equivalent between Groups A and C, and both significantly surpassed Group B (A vs. B: p < 0.05; C vs. B: p < 0.05).
The triple-low CCTA protocol using CE-Boost combined with deep learning reconstruction achieved a 52.0% reduction in radiation exposure and a 67.8% reduction in contrast agent usage while preserving diagnostic image quality, with CNR and noise levels comparable to the standard protocol. This demonstrates its clinical feasibility for repeated coronary evaluations without compromising diagnostic accuracy.
The Working Group on Cardiopulmonary Bypass and Extracorporeal Life Support of the China National Center for Cardiovascular Quality Improvement presents evidence-based guidelines on patient blood management for adult cardiac surgery under cardiopulmonary bypass. These guidelines aim to promote patient blood management programs, reduce blood loss and allogeneic transfusion, and improve patient outcomes. The Guidelines Panel comprises multidisciplinary experts. Based on prior investigation and the patient, intervention, comparison, outcome (PICO) framework, thirteen questions spanning four areas were selected: priming and fluid management during cardiopulmonary bypass, anticoagulation and monitoring during cardiopulmonary bypass, peri-cardiopulmonary bypass blood product infusion, and autologous blood infusion. Systematic reviews of the thirteen questions were performed through literature searches. Recommendations were generated using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system and were reviewed and refined by all Guideline Panel members. A total of 19 recommendations were approved after five expert meetings between 2023 and 2024. By implementing these recommendations, perfusionists and medical staff involved in cardiac surgery can optimize patient blood management, effectively decrease allogeneic blood transfusion rates, minimize perioperative bleeding, and ultimately improve prognosis in adult patients undergoing cardiac surgery with cardiopulmonary bypass.
Infective endocarditis (IE) is a severe condition characterized by a predominantly bacterial infection of the heart valves or endocardial surface, often leading to significant morbidity and mortality. Anemia is very common in patients with IE, which may be explained by factors such as chronic inflammation, hemolysis, kidney disease, and pre-existing iron deficiency. This review aimed to comprehensively examine the prevalence, causes, and clinical impact of anemia in IE patients and the role of blood transfusion in managing these patients. The diagnostic approach to anemia in IE includes combining clinical assessment and laboratory investigations, specifically distinguishing between different etiologies. Blood transfusion is likewise very common in IE, especially in surgically treated patients. Thus, balancing the need to correct anemia with the risks associated with blood transfusion is complex, and robust evidence is scarce. Management strategies for anemia in IE may extend beyond transfusion, encompassing pharmacological treatments such as iron supplementation and erythropoiesis-stimulating agents. Despite advancements in understanding the interplay between anemia and IE, several knowledge gaps and unresolved questions remain, necessitating further research to refine treatment protocols and improve patient outcomes. Future directions include investigating emerging therapeutic approaches, optimizing multidisciplinary care pathways, and developing evidence-based guidelines tailored to the unique needs of IE patients. This review underscores the importance of a comprehensive, individualized approach to managing anemia and transfusion in IE, aiming to enhance clinical outcomes and quality of life for affected patients.
The narrow therapeutic range of warfarin, together with its sensitivity to numerous influencing factors and its significant inter-individual variability, presents major challenges for personalized medication. This study aimed to combine clinical and genetic characteristics with machine learning (ML) algorithms to develop and validate a model for predicting stable warfarin doses in patients from Northern China after mechanical heart valve replacement surgery.
This study included patients who underwent mechanical heart valve replacement surgery at the Beijing Anzhen Hospital between January 2021 and January 2024 and achieved a stable warfarin maintenance dose. Comprehensive clinical and genetic data were collected, and patients were randomly divided into training and validation cohorts at an 8:2 ratio. Variables were selected using analysis of covariance (ANCOVA). Algorithms for predicting the stable warfarin dose were constructed using a traditional linear model, a general linear model (GLM), and 10 ML algorithms. Performance was evaluated and compared using R-squared (R2), mean absolute error (MAE), and the ideal prediction percentage to identify the optimal algorithm for predicting the stable warfarin dose and verify its clinical significance.
A total of 413 patients were included in this study for model training and validation, and 13 important features were selected for model development. The support vector machine radial basis function (SVM Radial) algorithm showed the best performance of all models, with the highest R2 value of 0.98 and the lowest MAE of 0.14 mg/day (95% confidence interval (CI): 0.11–0.17). This model successfully predicted the ideal warfarin dose in 93.83% of patients, with the highest ideal prediction percentage found in the medium-dose group (95.92%). In addition, the model demonstrated high predictive accuracy in both the low-dose and high-dose groups, with ideal prediction percentages of 85.71% and 92.00%, respectively.
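A hedged sketch of this evaluation loop for the winning model family, an RBF-kernel support vector regressor, on synthetic data; the "ideal prediction" threshold of ±20% of the actual dose is an assumption here, as the abstract does not state its definition:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

rng = np.random.default_rng(0)
n = 413
X = rng.normal(size=(n, 13))                               # 13 selected clinical/genetic features
y = 3.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.3, n)  # stable dose, mg/day

# 8:2 random split, as in the study design
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)).fit(X_tr, y_tr)
pred = model.predict(X_te)

ideal = np.mean(np.abs(pred - y_te) <= 0.2 * y_te) * 100   # assumed ±20% definition
print(f"R2={r2_score(y_te, pred):.2f}  MAE={mean_absolute_error(y_te, pred):.2f} mg/day  "
      f"ideal={ideal:.1f}%")
```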
Compared to previous methods, SVM Radial demonstrates significantly higher accuracy for predicting the warfarin maintenance dose following heart valve replacement surgery, suggesting it has potential for widespread application. However, this study was based on a relatively small sample size and conducted at a single center. Future research should involve larger sample sizes and multicenter data to validate the predictive accuracy of the SVM Radial model further.
The atherogenic index of plasma (AIP) is calculated as the logarithm of the triglyceride (TG) to high-density lipoprotein cholesterol (HDL-C) ratio. While previous studies suggested that TG and HDL-C levels were linked to the prognosis in various cardiovascular conditions, including ischemic heart failure (IHF), there is limited research specifically examining AIP in the context of IHF. Therefore, our study sought to explore the association between AIP and the prognosis of IHF and to compare the predictive value of AIP, HDL-C, and TG levels for identifying patients with poor outcomes.
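As a worked example of the definition above (assuming the common convention of a base-10 logarithm of molar concentrations, since the abstract only specifies "logarithm"):

```python
import math

def aip(tg_mmol_l: float, hdl_c_mmol_l: float) -> float:
    """AIP = log10(TG / HDL-C), concentrations in mmol/L (assumed convention)."""
    return math.log10(tg_mmol_l / hdl_c_mmol_l)

print(aip(1.7, 1.0))  # ≈ 0.23
print(aip(2.8, 0.9))  # ≈ 0.49, a higher-risk lipid profile
```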
This retrospective cohort study was conducted at a single institution and involved 2036 IHF patients who underwent percutaneous coronary intervention (PCI) and were followed for 36 months. Patients were divided into four groups according to AIP quartiles. The primary outcome was major adverse cardiovascular events (MACEs); secondary outcomes included all-cause mortality, non-fatal myocardial infarction (MI), and any revascularization. Kaplan–Meier survival curves were used to evaluate the occurrence of endpoints across the four groups. Multivariate Cox regression analysis was used to assess whether AIP independently predicted the primary and secondary outcomes. The restricted cubic spline (RCS) method was employed to examine non-linear associations between AIP and the endpoints. Receiver operating characteristic (ROC) curves, combined with the DeLong test, were used to assess and compare the predictive accuracy of AIP, TG, and HDL-C.
The incidences of MACEs (Q4 vs. Q1: 50.6% vs. 23.0%, p < 0.001), all-cause death (25.0% vs. 11.6%, p < 0.001), and any revascularization (21.6% vs. 9.6%, p < 0.001) were significantly higher in patients with elevated AIP. Kaplan–Meier analysis further supported a positive association between AIP and MACEs (log-rank p < 0.001). Multivariate Cox analysis showed that AIP was independently associated with increased risks of MACEs (Q4 vs. Q1, HR (95% CI): 2.84 (2.25–3.59), p for trend < 0.001), all-cause death (2.76 (1.98–3.84), p for trend < 0.001), non-fatal MI (3.01 (1.32–6.90), p for trend < 0.001), and any revascularization (2.92 (2.04–4.19), p for trend < 0.001). In the RCS analysis, higher AIP was non-linearly associated with an increased risk of MACEs (p for non-linearity = 0.0112). In subgroup analysis, the predictive value of AIP for MACEs was more pronounced in younger patients (p for interaction = 0.003). ROC analysis yielded areas under the curve (AUCs) of 0.641 for AIP, 0.600 for HDL-C, and 0.629 for TG; AIP outperformed both TG (difference in AUC (95% CI): 0.012 (0.001–0.024), DeLong p = 0.028) and HDL-C (0.041 (0.018–0.064), DeLong p < 0.001).
In IHF patients after PCI, AIP was strongly associated with an increased risk of MACEs and had better predictive validity than TG and HDL-C.
Patients with aortic dissection (AD) exhibit an elevated early mortality rate. A timely diagnosis is essential for successful management, but this is challenging. There are limited data delineating the factors contributing to a delayed diagnosis of AD. We conducted a scoping review to assess the time to diagnosis and explore the risk factors associated with a delayed diagnosis.
This scoping review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We conducted online searches in PubMed, Web of Science, Cochrane Library, Bing, Wanfang Data Chinese database, and the China National Knowledge Infrastructure (CNKI) Chinese database for studies that evaluated the diagnostic time and instances of delayed diagnoses of AD.
A total of 27 studies were retrieved from our online searches and included in this scoping review. The time from symptom onset to diagnosis ranged from 40.5 minutes to 84.4 hours, and the time from hospital presentation to diagnosis ranged from 0.5 to 25 hours. Multiple factors were associated with a significantly delayed diagnosis. Demographic and medical history predictors of delayed diagnosis included female sex, age, North American versus European geographic location, initial AD, a history of congestive heart failure, a history of hyperlipidemia, a distressed communities index >60, walk-in visits to the emergency department, transfer from a non-tertiary care hospital, and preoperative coronary angiography. Chest and back pain, especially abrupt or radiating pain, low systolic blood pressure, pulse deficit, and malperfusion syndrome required less time for diagnostic confirmation. In contrast, painlessness, syncope, fever, pleural effusion, dyspnea, troponin positivity, and an acute coronary syndrome-like electrocardiogram were more prevalent in patients with a delayed diagnosis.
A recognition of the features associated with both typical and atypical presentations of AD is useful for a rapid diagnosis. Educational efforts to improve clinician awareness of the various presentations of AD and, ultimately, improve AD recognition may be relevant, particularly in non-tertiary hospitals with low exposure to aortic emergencies.
Atrial fibrillation (AF) is a prevalent and complex arrhythmia for which the pathogenesis involves various electrophysiological factors, notably the regulation of calcium channels. This article aimed to investigate the specific roles and molecular mechanisms of the L-type and T-type calcium channels, ryanodine receptors (RyRs), inositol 1,4,5-triphosphate receptors (IP3Rs), calcium release-activated calcium (CRAC) channels, and transient receptor potential (TRP) channels in the pathogenesis and persistence of AF. In addition, this article reviews recent advances in calcium channel-targeted drugs from experimental and clinical studies, offering new insights into the relationship between calcium channel regulation and AF pathology. These findings suggest promising directions for further research into the mechanisms of AF and the development of targeted therapeutic strategies.
Rehabilitation through exercise is the core content of cardiac rehabilitation, which is conducive to promoting myocardial recovery and reducing mortality. However, the overall participation rate in exercise rehabilitation is low. Thus, this study aimed to comprehensively evaluate the barriers and facilitators of exercise rehabilitation for patients with myocardial infarction using the updated Consolidated Framework for Implementation Research (CFIR 2.0).
A systematic search was conducted in the PubMed, Embase, Web of Science, Cochrane Library, ProQuest, and PsycINFO databases. Based on CFIR 2.0, descriptive analysis was used to analyze the findings of each included study and to classify each factor as a barrier or facilitator.
In total, 5185 studies were obtained from the preliminary search; 11 studies were ultimately included, of which 5 were quantitative. This study summarized 50 influencing factors, comprising 27 barriers and 23 facilitators. Most factors were related to the individual domain (64%); the remainder were related to the inner setting domain (20%), innovation domain (10%), implementation process domain (4%), and outer setting domain (2%).
This study integrated the barriers and facilitators of exercise rehabilitation of patients with myocardial infarction. The study emphasizes the importance of considering the individual domain, inner setting domain, innovation domain, implementation process domain, and outer setting domain factors when implementing exercise rehabilitation. This study provides a systematic foundation for optimizing cardiac rehabilitation programs.
CRD42024521287, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024521287.
Cardiomyopathy denotes a group of heart diseases caused by structural or functional heart muscle disorders with various genetic and non-genetic etiologies. Based on the current literature, this narrative review synthesizes key findings on the classification, diagnosis, and prognosis of inherited and acquired cardiomyopathies. Unlike prior systematic reviews, it does not implement formal inclusion or exclusion criteria or a structured search strategy; rather, it draws on evidence from influential studies, prominent cardiology guidelines, and expert consensuses to provide a comprehensive overview of recent advances in the field. The latest advances in genetic mutations, diagnostic imaging techniques, and therapeutics are also explicated. All diagnoses involve clinical presentations, imaging, and laboratory tests. Future research directions include personalized therapy, quantitative imaging techniques, and new drug treatments, with an exhaustive integration of techniques and resources to catalyze further innovation in diagnostics and therapeutic approaches. By summarizing key developments and trends and emphasizing the integration of precision medicine, advanced imaging, and molecular diagnostics, this narrative review provides clinicians and researchers with insight into the future of cardiomyopathy management.
This study aimed to use machine learning to construct a prediction model for treatment plans in patients with coronary artery disease combined with diabetes mellitus, in order to efficiently formulate treatment plans for these patients and improve their prognosis. It further aimed to explain the model using SHapley Additive exPlanations (SHAP), explore related risk factors, provide a clinical reference, and lay the foundation for a future multicenter prediction model.
To investigate the relationship between concomitant coronary heart disease (CHD) and diabetes mellitus (DM), this study retrospectively included patients who attended the Beijing Anzhen Hospital of Capital Medical University between 2022 and 2023. The processed data were then input into five different algorithms for model construction. The performance of each model was rigorously evaluated using five specific evaluation indicators. The SHAP algorithm also provided clear explanations and visualizations of the model's predictions.
The optimal feature set determined by least absolute shrinkage and selection operator (LASSO) regression comprised 15 features drawn from general information, laboratory test results, and echocardiographic findings. The best-performing model was the eXtreme Gradient Boosting (XGBoost) model. The SHAP-based interpretation suggests that the feature with the greatest impact on the model's predictions is the glycated hemoglobin level.
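A hedged sketch of this pipeline on synthetic data, with hypothetical feature names: LASSO-style selection, an XGBoost classifier, and SHAP values ranked by mean absolute contribution (in the study, glycated hemoglobin ranked first):

```python
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame(rng.normal(size=(n, 20)),
                 columns=[f"feat_{i}" for i in range(20)])
y = (X["feat_0"] + 0.5 * X["feat_1"] + rng.normal(0, 1, n) > 0).astype(int)

# LASSO keeps features with non-zero coefficients
lasso = LassoCV(cv=5).fit(X, y)
selected = X.columns[lasso.coef_ != 0]

X_tr, X_te, y_tr, y_te = train_test_split(X[selected], y, random_state=0)
model = xgb.XGBClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)

# SHAP values quantify each feature's contribution to individual predictions
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
importance = np.abs(shap_values).mean(axis=0)
print(pd.Series(importance, index=selected).sort_values(ascending=False).head())
```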
Using machine-learning algorithms, we built a prediction model of treatment plans for patients with concomitant DM and CHD by integrating patient information, and we screened the best feature set, which contains 15 features. This provides support and strategies for developing the best treatment plan for patients with concomitant DM and CHD.
This study reviews the correlation between the gut virome and cardiovascular diseases (CVDs) and investigates the potential role of the gut virome in CVD progression. The gut virome, which includes bacteriophages and eukaryotic viruses, interacts with the intestinal microbiota and the host immune system, and affects the overall health of the host. Previous studies have demonstrated that alterations in the gut virome are closely associated with various cardiovascular conditions, including hypertension, atherosclerosis, atrial fibrillation, heart failure, and viral myocarditis. Thus, the gut virome may contribute to CVD development by regulating intestinal microecology and immune responses, affecting intestinal barrier function and systemic inflammatory responses. However, despite advancements in gut virome research, our understanding of the specific mechanisms involved and therapeutic potential in CVDs remains limited. Future developments in virus databases and advancements in sequencing technology are expected to offer new insights and methods for the early diagnosis and accurate treatment of CVDs.
The high prevalence and mortality rate of combined atrial fibrillation (AF) and obstructive sleep apnea syndrome (OSAS) impose a significant disease burden on public healthcare systems. However, there is currently a lack of risk-assessment tools for all-cause mortality in patients with both AF and OSAS. Therefore, this study utilized clinical data from patients at the First Affiliated Hospital of Xinjiang Medical University to establish a predictive model and address this gap.
This study included 408 patients with AF and OSAS, randomly divided into a training set (n = 285) and a validation set (n = 123). Subsequently, the training set was split into deceased and surviving groups to analyze in-hospital indicators.
A total of 10 variables were selected from an initial 64 through LASSO regression screening: hypoxemia, catheter ablation (CA), red blood cell count (RBC), lymphocyte count, basophil count, total bile acids, D-dimer, free triiodothyronine, N-terminal pro-brain natriuretic peptide (NT-proBNP), and chronic obstructive pulmonary disease. Variables significant in the univariate logistic regression analysis were entered into the multivariable logistic regression analysis, which revealed that CA (odds ratio (OR) = 0.21) was an independent protective factor, whereas moderate-to-severe hypoxemia (OR = 11.11), RBC <3.8 × 10¹²/L (OR = 20.70), and D-dimer ≥280 ng/mL (OR = 7.07) were independent risk factors. Receiver operating characteristic (ROC) curves yielded area under the curve (AUC) values of 0.96 for the training set and 0.91 for the validation set, indicating good predictive ability. A risk-scoring system was then developed to assess the overall mortality risk of patients with AF and OSAS; a percentage bar chart demonstrated that the mortality rate increased and the survival rate decreased as the risk level increased.
The predictive model and risk scoring system developed in this study exhibit good predictive abilities in evaluating all-cause mortality in patients with AF and OSAS, providing valuable clinical guidance and reference.
Antiplatelet therapy plays a pivotal role in the management of atherosclerotic cardiovascular diseases, providing critical protection against thrombotic complications. However, the role of antiplatelet therapy in primary prevention is limited, as an elevated risk of bleeding often offsets the potential benefits. Meanwhile, long-term antiplatelet monotherapy in secondary prevention provides clear benefits for stable patients. In the setting of acute coronary syndromes, dual antiplatelet therapy, which combines aspirin with a P2Y12 inhibitor, such as clopidogrel, prasugrel, or ticagrelor, has demonstrated superior efficacy over aspirin alone, with prasugrel and ticagrelor offering more rapid and potent effects. However, the increased bleeding risk associated with more intensive regimens necessitates careful assessment of both ischemic and bleeding risks, particularly in high-risk individuals. Recent advancements in stent technology and a deeper understanding of patient-specific risk profiles have led to significant advances in tailoring antiplatelet strategies. Current guidelines emphasize individualized approaches regarding the duration and intensity of the therapy. This review examines the evolution of antiplatelet treatment strategies in heart diseases, integrating evidence from pivotal studies to highlight current practices, while addressing considerations for special populations and optimal antithrombotic regimens following structural cardiac interventions. The development of novel agents, such as targeted antithrombotic therapy, and personalized therapeutic approaches continues to shape efforts to improve both efficacy and safety. Together, these advances support a more refined, patient-centered approach to antiplatelet therapy aimed at optimizing clinical outcomes in the context of a highly dynamic and evolving therapeutic landscape.
To investigate the factors that influence blood transfusions after neonatal cardiac surgery and their association with prolonged mechanical ventilation (PMV) to provide a basis for optimizing blood transfusion strategies.
This study retrospectively analyzed the clinical data of 202 neonates who had undergone cardiac surgery with cardiopulmonary bypass (CPB) in Beijing Anzhen Hospital from 2019 to 2023. Demographic data, preoperative parameters (body weight, hemoglobin, Risk-Adjusted Classification of Congenital Heart Surgery 1 (RACHS-1) score), intraoperative data (CPB time, aortic cross-clamp time, deep hypothermic circulatory arrest (DHCA)), and transfusions of red blood cells (RBCs), fresh frozen plasma (FFP), and platelet concentrate (PC) within 48 hours after surgery were collected. PMV was defined as mechanical ventilation ≥96 hours after surgery. Multivariate logistic regression was used to analyze independent risk factors for PMV, and the dose–response relationship between transfusion volume and PMV was evaluated by restricted cubic splines (RCSs).
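A minimal sketch of the RCS dose–response check on synthetic data with hypothetical column names: a natural cubic spline basis is compared against the linear model with a likelihood-ratio test, where a non-significant p supports the linear relationship reported below:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 202
df = pd.DataFrame({"rbc_ml_kg": rng.gamma(2.0, 10.0, n)})  # RBC volume, mL/kg
# Simulate PMV with a log-odds linear in transfusion volume
df["pmv"] = (rng.random(n) < 1 / (1 + np.exp(-(0.03 * df["rbc_ml_kg"] - 1)))).astype(int)

linear = smf.logit("pmv ~ rbc_ml_kg", data=df).fit(disp=0)
spline = smf.logit("pmv ~ cr(rbc_ml_kg, df=4)", data=df).fit(disp=0)  # natural cubic splines

# Likelihood-ratio test for non-linearity
lr = 2 * (spline.llf - linear.llf)
p_nonlinear = chi2.sf(lr, df=spline.df_model - linear.df_model)
print(f"p for non-linearity: {p_nonlinear:.3f}")
```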
Within 48 hours postoperation, 50.00% of patients received RBCs, 37.62% received FFP, and 27.72% received PC, and the incidence of PMV was 36.63%. Lower body weight (odds ratio (OR) = 0.38, 95% confidence interval (CI): 0.20–0.74; p = 0.005), lower preoperative hemoglobin (OR = 0.99; 95% CI: 0.97–0.99; p = 0.041), a RACHS-1 score of 4 (OR = 2.56; 95% CI: 1.04–6.27; p = 0.040), RBC transfusion (OR = 2.02; 95% CI: 1.02–4.00; p = 0.043), and FFP transfusion (OR = 1.98; 95% CI: 1.02–3.85; p = 0.043) were independent risk factors for PMV. The RCS analysis demonstrated a linear dose–response relationship between the volume of RBCs infused and PMV (p for non-linearity = 0.668), whereas no association was found for FFP. Patients with PMV had significantly longer intensive care unit (ICU) stays (14 vs. 8 days) and hospitalizations (18 vs. 13 days) (both p < 0.001).
Blood transfusion after neonatal cardiac surgery is an important controllable risk factor for the development of PMV, and its risk increases linearly with the volume of RBC transfusion. Future multicenter prospective studies are needed to validate the causal association further.
Transcatheter aortic valve implantation (TAVI) is a minimally invasive procedure to treat severe aortic stenosis in select patients. Patients who have undergone TAVI are at high risk of infective endocarditis (IE), especially during the first year post-operation. Early diagnosis of IE is essential to initiate targeted antibiotic therapy and/or surgical intervention. However, the early detection of IE following TAVI poses significant diagnostic challenges. Current imaging techniques, including echocardiography, nuclear imaging, and magnetic resonance imaging, have varying degrees of sensitivity and specificity, each with inherent limitations. Nuclear imaging modalities, such as positron emission tomography/computed tomography using 18F-fluorodeoxyglucose (18F-FDG PET/CT) and white blood cell single photon emission computed tomography/computed tomography (WBC SPECT/CT), have shown promise in early IE detection, particularly due to the ability of these methods to identify metabolic and anatomical abnormalities. However, false-positive results related to post-operative inflammation complicate data interpretation, and limited data exist for using these methods in very early IE detection post-TAVI. Intracardiac echocardiography (ICE) offers enhanced visualization of prosthetic valve leaflets, but the invasive nature of ICE restricts its widespread use. Whole-body imaging, such as 18F-FDG PET/CT, facilitates the identification of distant lesions and systemic complications, aiding diagnosis and treatment decisions. Diagnosing IE after TAVI is especially challenging within the first 60 days post-procedure, a critical period when imaging findings may be inconclusive due to false negatives or limited availability of advanced modalities. This review underscores the diagnostic complexity of very early and early (0–60 days) IE post-TAVI, emphasizing the need for a multimodal imaging approach to overcome the limitations of individual modalities. Nonetheless, early antimicrobial therapy should be considered even without definitive imaging findings, highlighting the importance of clinical vigilance in managing this challenging condition.
Congenital long QT syndrome (LQTs) is an inherited cardiac condition resulting from cardiac repolarization abnormalities. Since the initial description of congenital LQTs by Jervell and Lange-Nielsen in 1957, our understanding of this condition has increased dramatically. A diagnosis of congenital LQTs is based on the patient's medical history and electrocardiographic features, together with a genetic variant that is identified in approximately 75% of cases. Appropriate risk stratification involves a multitude of factors, with β-blockers being the cornerstone of therapy. Recent developments, such as the incorporation of artificial intelligence (AI) for electrocardiogram (ECG) interpretation, genotype–phenotype-specific therapies, and emerging gene therapies, may make personalized medicine in LQTs a reality in the near future. This review summarizes our current understanding of congenital LQTs, focusing on risk stratification, current therapeutic interventions, and emerging developments in management.
Pleural effusion (PE) commonly occurs in cardiac surgery patients, often requiring tube drainage. This study aimed to investigate associations between PE drainage trajectories and clinical outcomes in patients undergoing cardiac surgery.
Patients who underwent cardiac surgery and subsequent tube drainage during hospitalization in the intensive care unit, due to substantial PE, were enrolled. PE drainage volumes were recorded daily. The relationships between PE drainage and poor outcome or mortality risks were examined using logistic regression analysis. Latent class growth analysis (LCGA) was used to classify PE trajectories, and the characteristics of each latent class were compared.
In total, 386 patients were enrolled over 3 years, of whom 113 (29.3%) developed poor outcomes. These patients had significantly higher average PE drainage volumes on days 2–4 (1.7 vs. 1.2 mL/kg/day; p = 0.002) and days 5–7 (0.9 vs. 0 mL/kg/day; p < 0.001). Average PE drainage volumes during the first 2–4 and 5–7 days were associated with poor outcomes (odds ratio (OR) = 1.10 (95% confidence interval (CI): 1.02–1.20); p = 0.014 and 1.19 (95% CI: 1.08–1.32); p < 0.001, respectively). LCGA identified three distinct PE drainage trajectory classes: persistently high (Class 1, n = 39), gradually declining from high to low (Class 2, n = 128), and persistently low (Class 3, n = 219). Among these, Class 1 had the highest mortality and poor outcome risks.
The trajectory of PE formation demonstrated a strong correlation with mortality and poor outcomes in patients who underwent cardiac surgery. Patients with persistently high PE drainage volumes require close monitoring and attention.
Major adverse cardiovascular events (MACEs) significantly affect the prognosis of patients with myocardial infarction (MI). With the widespread application of machine learning (ML), researchers have attempted to develop models for predicting MACEs following MI. However, there remains a lack of evidence-based proof to validate their value. Thus, we conducted this study to review the ML models’ performance in predicting MACEs following MI, contributing to the evidence base for the application of clinical prediction tools.
A systematic literature search spanned four major databases (Cochrane, Embase, PubMed, Web of Science), with entries through June 19, 2024. The risk of bias in the included models was appraised using the Prediction Model Risk of Bias Assessment Tool (PROBAST). Subgroup analyses based on whether patients underwent percutaneous coronary intervention (PCI) were carried out.
Twenty-eight studies covering 59,392 patients with MI were included in the analysis. The pooled C-index for ML models in the validation sets was 0.77 (95% CI 0.74–0.81) for predicting MACEs post-MI, with a sensitivity (SEN) and specificity (SPE) of 0.78 (95% CI 0.73–0.82) and 0.85 (95% CI 0.81–0.89), respectively. In patients who underwent PCI, the pooled validation C-index was 0.73 (95% CI 0.66–0.79), with an SEN of 0.75 (95% CI 0.67–0.81) and an SPE of 0.84 (95% CI 0.75–0.90). Logistic regression was the predominant model in the studies and demonstrated relatively high accuracy.
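For context, a pooled C-index such as 0.77 (95% CI 0.74–0.81) is typically obtained by random-effects pooling of per-study estimates; a DerSimonian–Laird sketch with illustrative numbers (not the review's data):

```python
import numpy as np

# Per-study C-indexes and standard errors (illustrative)
c_index = np.array([0.74, 0.79, 0.81, 0.72, 0.77])
se = np.array([0.03, 0.02, 0.04, 0.03, 0.02])

w = 1 / se**2                                   # fixed-effect weights
mu_fe = np.sum(w * c_index) / np.sum(w)
q = np.sum(w * (c_index - mu_fe) ** 2)          # Cochran's Q
# DerSimonian-Laird between-study variance, truncated at zero
tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)                       # random-effects weights
mu_re = np.sum(w_re * c_index) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled C-index {mu_re:.2f} "
      f"(95% CI {mu_re - 1.96 * se_re:.2f}-{mu_re + 1.96 * se_re:.2f})")
```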
ML models based on clinical characteristics can predict MACEs following MI with moderate accuracy, although the choice of model and population influences predictive performance. Future studies should include larger sample sizes and develop simplified tools for predicting MACEs.
CRD42024564550, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024564550.
Aortic regurgitation is a valvular disorder that necessitates the integration of multiple aspects of clinical practice. The underlying etiologies span an array of pathologies, including congenital, infectious, structural, and traumatic causes. Imaging studies range from traditional cardiac diagnostic strategies to advanced imaging modalities. Meanwhile, depending on clinical presentation, management may involve a medical or surgical approach. Long-term surveillance and chronic disease management are crucial in preventing progression into further complications, such as heart failure. This review aims to provide a thorough, comprehensive analysis of the contemporary understanding of aortic regurgitation. The information we have included will offer a unique perspective on how recent updates in diagnostic and management strategies can be applied to provide excellent patient care. Specifically, we have attempted to focus on exploring the innovations in the invasive management of aortic regurgitation.
The risk factors for developing postoperative pediatric delirium (PD) are multifactorial and include underlying conditions, cyanosis, surgery, intensive care stay, analgesia used for sedation, and withdrawal symptoms. Disturbed cerebral autoregulation in children with congenital heart disease (CHD) can lead to hyper- and hypoperfusion states of the central nervous system and is potentially associated with poor neurological outcomes. Our study aimed to investigate whether disturbed cerebral autoregulation postoperatively is associated with the onset of PD in children with CHD.
We conducted a prospective observational study in neonates and infants undergoing corrective surgery for CHD via cardiopulmonary bypass (CPB). Cerebral regional oxygen saturation (rSO2) and mean arterial pressure (MAP) were measured within the first 24 hours after surgery in the pediatric intensive care unit (PICU). The cerebral oximetry index (COx) was calculated from these parameters using ICM+ software. A COx ≥0.4 was considered indicative of impaired autoregulation. Delirium symptoms were assessed using the Sophia Observation of Withdrawal–Pediatric Delirium (SOS-PD) score.
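COx is conventionally computed as a moving correlation coefficient between slow waves of MAP and rSO2 (ICM+ typically uses ~10-second averages over a ~300-second window); assuming that convention, a minimal pandas version on synthetic data:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = 8640                                   # 24 h of 10-s averages
map_mmhg = 60 + np.cumsum(rng.normal(0, 0.1, t))
# Pressure-passive rSO2 (i.e., impaired autoregulation) plus noise
rso2 = 70 + 0.3 * (map_mmhg - 60) + rng.normal(0, 1, t)

s = pd.DataFrame({"map": map_mmhg, "rso2": rso2})
cox = s["map"].rolling(window=30).corr(s["rso2"])   # 30 samples ≈ 300 s

impaired = cox >= 0.4                      # threshold used in the study
print(f"time with impaired autoregulation: {100 * impaired.mean():.1f}%")
```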
Cerebral autoregulation was evaluated postoperatively at the bedside in 49 neonates and infants (22 males [44.9%] and 27 females [55.1%]) between January 2019 and April 2023. The median age was 134 days (interquartile range (IQR): 49.5–184 days), the median weight was 5.1 kg (IQR: 4.0–6.3 kg), and the median monitoring duration was 23.0 hours (IQR: 20–24.5 hours). In total, 27/49 (55%) patients developed postoperative PD during their stay in the PICU. There was no statistically significant difference in the duration of globally impaired autoregulation between the delirious and non-delirious groups (14.5% vs. 13.9%, p = 0.416). No evidence was found for an effect of MAP outside the lower or upper limits of autoregulation on the onset of postoperative delirium (p = 0.145 and p = 0.904, respectively). Prolonged mechanical ventilation, longer PICU stays, and higher use of opioids and benzodiazepines were observed in the delirious group.
Our findings suggest that impairment of cerebral autoregulation cannot solely explain the higher rate of PD in children undergoing congenital cardiac surgery. Rigorous hemodynamic management may potentially minimize the impact of cerebral hypo- or hyperperfusion states during the postoperative period, preventing their harmful effects. Additional studies with a larger sample size are needed to confirm the hypothesis and current findings.
Mesenteric malperfusion (MMP) represents a severe complication of acute aortic dissection (AAD). Research on risk identification models for MMP is currently limited.
Based on a retrospective study of medical records from the Beijing Anzhen Hospital spanning from January 2016 to June 2022, we included 435 patients with AAD and allocated their data to training and testing sets at a ratio of 7:3. Key preoperative predictive variables were identified through the least absolute shrinkage and selection operator (LASSO) regression. Subsequently, six machine learning algorithms were used to develop and validate an MMP risk identification model: logistic regression (LR), support vector classification (SVC), random forest (RF), extreme gradient boosting (XGBoost), naive Bayes (NB), and multilayer perceptron (MLP). To determine the optimal model, the performance of the model was evaluated using various metrics, including the area under the receiver operating characteristic curve (AUROC), accuracy, sensitivity, specificity, and the Brier score.
LASSO regression identified white blood cell count (WBC), neutrophil count (NE), lactate dehydrogenase (LDH), serum lactate, and arterial blood pH as key predictive variables. Among these, WBC (OR 1.169, 95% confidence interval [CI] 1.086–1.258; p < 0.001) and LDH (OR 1.001, 95% CI 1.000–1.003; p = 0.008) were independent risk factors for MMP. Among the six assessed machine learning algorithms, the RF model exhibited the best predictive capability, yielding AUROCs of 0.888 (95% CI 0.887–0.889) and 0.797 (95% CI 0.794–0.800) and sensitivities of 0.864 (95% CI 0.862–0.867) and 0.811 (95% CI 0.806–0.816) in the training and testing datasets, respectively.
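A hedged sketch of this workflow on synthetic data: LASSO-penalized logistic regression for variable selection, then a random forest (the best-performing model here) scored by AUROC and the Brier score:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(0)
n = 435
X = rng.normal(size=(n, 10))    # stand-ins for WBC, NE, LDH, lactate, pH, ...
y = (0.8 * X[:, 0] + 0.6 * X[:, 2] + rng.normal(0, 1, n) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# L1 (LASSO) penalty zeroes out uninformative coefficients
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_.ravel() != 0)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(X_tr[:, selected], y_tr)
prob = rf.predict_proba(X_te[:, selected])[:, 1]
print(f"AUROC={roc_auc_score(y_te, prob):.3f}  Brier={brier_score_loss(y_te, prob):.3f}")
```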
This study employed machine learning algorithms to develop a model that identifies MMP risk from initial preoperative laboratory test results. The model can support diagnostic and treatment decision-making for MMP.
Heart failure (HF) is a complex clinical syndrome and one of the leading causes of morbidity and mortality in developed nations. It is well established that every HF-related hospital admission worsens quality of life for patients and their caregivers and imposes a significant financial burden on society. Reducing hospital admissions for this population has therefore emerged as a critical goal over the past decades. Initial attempts at remote monitoring relied on self-reported vital signs and symptoms, but these proved ineffective. Subsequent technological advancements have enabled the development of miniature sensors capable of detecting and monitoring a wide range of physiologically relevant parameters, some of which have been integrated into implantable devices such as pacemakers and defibrillators. More recently, noninvasive monitoring has emerged as an alternative for patients with HF, offering early congestion detection without requiring an invasive procedure. This review summarizes implanted and noninvasive devices, their characteristics, the parameters they monitor, and the potential limitations and challenges of integrating them into routine clinical practice.
Coronary microcirculatory dysfunction (CMD) after percutaneous coronary intervention (PCI) in patients with acute myocardial infarction (AMI) may adversely affect prognosis. The objective of this study was to assess postoperative microcirculatory status and to construct a predictive model for CMD.
This study was a retrospective analysis of 187 patients with AMI who underwent PCI at Xuanwu Hospital. Patients were divided into two cohorts based on postoperative angiography-derived microcirculatory resistance (AMR) values: a non-CMD group (AMR <250 mmHg·s/m, n = 93) and a CMD group (AMR ≥250 mmHg·s/m, n = 76). Clinical and laboratory data were extracted, risk factors associated with CMD were identified through LASSO regression analysis, and predictive models were constructed.
The non-CMD group (n = 93) had a significantly lower body mass index (BMI) (25.40 ± 2.84) and a higher proportion of males (91.4%) compared with the CMD group (n = 76) (BMI: 26.64 ± 3.74, p < 0.05; males: 78.9%, p < 0.05). The non-CMD group also exhibited lower creatine kinase (CK) levels, glucose (GLU) levels, mean platelet volume (MPV), and platelet distribution width (PDW). LASSO regression identified significant predictors of CMD after PCI in patients with AMI. A nomogram based on these predictors showed good predictive performance (area under the curve (AUC): 0.737) and a higher net benefit than individual models; a sketch of the net-benefit calculation follows.
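The "net benefit" used to compare the nomogram with individual models comes from decision-curve analysis. Below is a minimal sketch of that calculation; y_test and prob_nomogram are placeholder names for held-out labels and nomogram-predicted CMD probabilities.

```python
import numpy as np

def net_benefit(y_true, prob, threshold):
    """Decision-curve net benefit at risk threshold pt:
    NB = TP/n - (FP/n) * pt / (1 - pt)."""
    y_true = np.asarray(y_true)
    pred = np.asarray(prob) >= threshold
    n = len(y_true)
    tp = np.sum(pred & (y_true == 1))
    fp = np.sum(pred & (y_true == 0))
    return tp / n - (fp / n) * threshold / (1.0 - threshold)

# Evaluate across a range of plausible risk thresholds (illustrative).
thresholds = np.linspace(0.05, 0.60, 12)
curve = [net_benefit(y_test, prob_nomogram, t) for t in thresholds]
```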
The predictive model developed in this study effectively identifies the risk of microcirculatory dysfunction in AMI patients after PCI, providing important insights for clinical decision-making. Future research should further validate the external applicability of this model and explore its potential in clinical practice.
NCT06062316, https://clinicaltrials.gov/study/NCT06062316?term=NCT06062316&rank=1, registration time: December 21, 2023.
This study aimed to reveal the age- and gender-related differences in left ventricular function among patients with normal cardiac structure.
A retrospective analysis was performed on 10,853 individuals with normal cardiac structures undergoing transthoracic echocardiography (2017–2020). We performed distribution analysis using kernel density estimation with Gaussian kernels and created smooth trajectories based on generalized additive models. Moreover, correlation analysis and multivariable regression were applied to evaluate the impact of age and gender on ventricular function.
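A minimal sketch of the two analyses named above, using scipy for Gaussian-kernel density estimation and statsmodels for a spline-based generalized additive model; variable names, grids, and spline settings are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde
from statsmodels.gam.api import GLMGam, BSplines

# ef: ejection fraction (%), age: age in years; placeholder arrays for
# the echocardiographic measurements analyzed in the study.

# Distribution analysis: Gaussian-kernel density estimate of EF.
kde = gaussian_kde(ef)
grid = np.linspace(40, 80, 200)
density = kde(grid)

# Smooth EF-vs-age trajectory: GAM with a cubic B-spline basis for age.
splines = BSplines(age[:, None], df=[6], degree=[3])
intercept = np.ones((len(ef), 1))
gam = GLMGam(ef, exog=intercept, smoother=splines).fit()
age_grid = np.linspace(18, 90, 100)
trajectory = gam.predict(exog=np.ones((len(age_grid), 1)),
                         exog_smooth=age_grid[:, None])
```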
A weak but statistically significant correlation was found between age and ejection fraction (B-coefficient = –0.077, p < 0.001). Females presented with a higher ratio of early diastolic mitral inflow velocity (E) to early diastolic mitral annular tissue velocity (e') than males across all age decades (p < 0.001). Age demonstrated stronger associations with functional parameters in individuals below 51.4 years (both genders, p < 0.001). Multivariable regression analysis indicated that age and male gender were independent predictors of reduced septal and lateral e' velocities (both p < 0.001), with males showing lower values (septal B-coefficient = –0.290; lateral B-coefficient = –0.463).
This study provided the distribution of left ventricular systolic/diastolic function across age decades in males and females and highlighted the clinical importance of monitoring ventricular function even for patients with normal cardiac structure.
Left atrial appendage thrombosis (LAAT), a contraindication to catheter ablation (CA), is a major problem in patients with non-valvular atrial fibrillation (AF). This study aimed to investigate the dynamics of LAAT and identify the factors associated with resistance to LAAT resolution over a 12-month period.
A total of 83 of the 2766 patients with AF who underwent transesophageal echocardiography (TEE) before CA (median age, 62 years; 49 men) participated in the follow-up study. All patients received oral anticoagulants (OACs) and underwent a general clinical examination, which included a complete blood count, biochemical tests, and transthoracic echocardiography. In total, 39 patients (47%) had paroxysmal AF, and 44 patients (53%) had persistent AF.
Patients were divided into two groups based on dynamic TEE monitoring: Group 1 (n = 45), comprising patients whose LAAT resolved within 12 months, and Group 2 (n = 38), comprising patients whose LAAT persisted until the end of follow-up. No significant differences were observed in age, sex, or the incidence of cardiovascular disease between the groups. However, Group 2 patients were more likely to be receiving beta-blockers, diuretics, and rivaroxaban at the start of the study. OACs were changed in 65 patients because of repeated detection of LAAT. Comparative analysis revealed that Group 2 patients had a higher right atrial volume index and higher N-terminal pro-brain natriuretic peptide (NT-proBNP) levels, mean platelet volume (MPV), platelet distribution width, and platelet–large cell ratio. Multivariate logistic regression analysis was used to derive a prediction model for LAAT resistance, which included three independent predictors: diuretic intake (odds ratio (OR) 3.800, 95% confidence interval (CI) 1.281–11.275; p = 0.016), NT-proBNP level (OR 1.001, 95% CI 1.000–1.001; p = 0.015), and MPV (OR 1.892, 95% CI 1.056–3.387; p = 0.032). Receiver operating characteristic (ROC) analysis confirmed the good quality of the model: area under the ROC curve (AUC) 0.789, specificity 72.7%, and sensitivity 73.3%.
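For illustration, the three reported odds ratios can be converted into a logistic risk score by taking their natural logarithms as coefficients. The model intercept is not reported in the abstract, so the b0 placeholder below (and the assumed NT-proBNP unit) make the absolute probabilities illustrative only.

```python
import math

# Coefficients derived from the reported odds ratios (log-ORs).
B_DIURETICS = math.log(3.800)  # diuretic intake (1 = yes, 0 = no)
B_NTPROBNP  = math.log(1.001)  # per unit NT-proBNP (pg/mL assumed)
B_MPV       = math.log(1.892)  # per 1 fL mean platelet volume

def laat_resistance_probability(diuretics, nt_probnp, mpv, b0=0.0):
    """Predicted probability that LAAT persists at 12 months.

    b0 is the (unreported) model intercept; 0.0 is a placeholder, so
    the absolute probabilities returned here are illustrative only.
    """
    logit = b0 + B_DIURETICS * diuretics + B_NTPROBNP * nt_probnp + B_MPV * mpv
    return 1.0 / (1.0 + math.exp(-logit))
```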
This study confirmed that approximately 47% of LAATs remain resistant to lysis 1 year after initial detection in patients with AF, despite the use of OACs. To our knowledge, this is the first time that platelet morphofunctional parameters, particularly MPV, have been identified as predictors of LAAT lysis resistance; further research in this direction is needed.
Whether diabetic microangiopathy causes subclinical left ventricular (LV) systolic dysfunction remains unclear. Myocardial deformation (strain) imaging can detect LV systolic dysfunction earlier than conventional ejection fraction evaluation. This study therefore aimed to examine the relationship between uncontrolled diabetes and impaired LV global longitudinal strain (GLS) in patients with diabetes mellitus (DM) compared with non-diabetic individuals.
A total of 76 asymptomatic patients with uncontrolled type 2 DM and 76 age- and gender-matched healthy controls underwent transthoracic echocardiography. Patients with coronary artery disease, an LV ejection fraction <55%, atrial fibrillation, or inadequate echocardiographic image quality were excluded. Diabetic microvascular complications were defined as the presence of proliferative retinopathy, microalbuminuria, nephropathy, or peripheral neuropathy.
The absolute GLS% was significantly lower in the uncontrolled diabetic group (–18.4 ± 1.7) than in controls (–22 ± 1.9, p < 0.001). Absolute GLS% decreased stepwise with the number of complications: –18.9 ± 1.7 with no complications, –17.5 ± 1.3 with one, and –16.8 ± 1.3 with two or more (p = 0.001). Regression analysis showed that a greater number of complications was associated with lower absolute GLS% (β = 0.41, p < 0.001). No significant difference was found in LV mass between hypertensive (155.1 ± 40.4) and non-hypertensive individuals (139.8 ± 44.3; p = 0.19).
Uncontrolled diabetes and the presence of complications were associated with lower absolute GLS% values, suggesting impaired myocardial deformation. These findings highlight the importance of monitoring GLS% as a potential marker for cardiac involvement in diabetic patients.
As the use of wearable devices continues to expand, their integration into healthcare is becoming increasingly prevalent. Significant advancements have been made in cardiology through the application of wearable technology to monitor heart rate, rhythm, and other biological signals. This review examines the various applications of wearable technology in cardiology, with the goal of improving patient care. We evaluate the accuracy and functionality of existing wearable electrocardiograms, defibrillators, blood pressure monitors, fitness trackers, activity trackers, and sleep trackers, including their roles in cardiac rehabilitation. Studies such as the Apple Heart Study and the Fitbit Heart Study have shown that wearable electrocardiograms achieve accuracy comparable to that of traditional monitoring devices, and recent research suggests they can also help reduce healthcare costs. However, as the technology evolves, challenges related to accessibility, patient privacy, and the need for improved accuracy are emerging, and we highlight recent advancements that aim to address them. Nonetheless, further research is crucial to critically assess and identify shortcomings, as wearable devices possess significant potential to enhance cardiovascular and overall health.
Atrial fibrillation (AF) is the most common arrhythmia worldwide. It is characterized by uncoordinated atrial activation that leads to a loss of effective atrial contraction, promoting atrial thrombus formation and thereby increasing the risk of cardioembolic stroke and mortality, along with healthcare expenditure. Stroke prevention therefore represents a key focus in managing patients with AF, and strategies to achieve it have evolved drastically over the years. Previously, aspirin and warfarin were the cornerstones of stroke prophylaxis; however, direct oral anticoagulants have since emerged and are now recognized as a safer and more effective alternative for non-valvular AF. Meanwhile, newer non-pharmacological approaches to preventing AF-related strokes, such as left atrial appendage occlusion devices, have been approved to obviate the need for lifelong anticoagulation in patients with elevated bleeding risk. This review outlines the current recommendations and provides an overview of the literature on stroke prevention in patients with AF, with a particular focus on direct-acting oral anticoagulants. Comparisons between these agents and special considerations for their use are also reviewed.