Entropy derived from late gadolinium enhancement (LGE) has been shown to correlate with major adverse cardiac events (MACEs) in various cardiac diseases. However, the association between myocardial entropy and MACEs in patients with hypertension (HTN) has not been reported.
This study recruited 190 patients with high cardiovascular risk and essential HTN who underwent cardiac magnetic resonance (CMR) examination at our hospital between January 2020 and June 2024. HTN patients were followed up for MACEs, defined as hospitalization for heart failure, acute coronary syndrome, stroke, or all-cause death. Patients were divided into MACE and non-MACE groups. Cardiac morphology, function, and tissue characteristics were assessed using CMR, and left ventricular (LV) entropy was derived from LGE images.
Of the 190 patients with HTN, 54 (28.4%) experienced a MACE over a median follow-up of 12.0 (8.0–27.0) months. LV entropy was significantly higher in patients with MACEs than in those without (5.75 ± 0.89 vs. 5.12 ± 1.26; p < 0.001). Furthermore, LV entropy was an independent predictor of MACEs, even after adjustment for clinical risk factors (odds ratio: 1.569, 95% confidence interval: 1.039–2.369; p = 0.032). Receiver operating characteristic (ROC) curve analysis showed the predictive value of LV entropy, with an area under the curve (AUC) of 0.663. Adding LV entropy to the clinical model resulted in a relatively higher AUC (0.813 vs. 0.806) for the prediction of MACEs; however, this was not significantly different from the clinical model alone (p = 0.570).
HTN patients with MACEs showed higher LV entropy than patients without MACEs. Furthermore, as an independent predictor of MACEs, LV entropy may aid the risk stratification of HTN patients with high cardiovascular risk.
ChiCTR2100049160, https://www.chictr.org.cn/showproj.html?proj=130381.
Transcatheter aortic valve implantation (TAVI) is increasingly utilized for patients with pure aortic regurgitation (PAR). A significant clinical challenge in this patient population is the need for permanent pacemaker implantation (PPI), which occurs frequently post-TAVI and can impact cardiac conduction and rhythm management. This study aimed to explore the effects of PPI on short-term mortality, rates of adverse events, and cardiac function in PAR patients following TAVI.
This retrospective study, conducted in a single center, included 69 PAR patients who underwent TAVI from January 2021 to December 2023. Patients were categorized into two groups: those who received a permanent pacemaker (PM) and those who did not (NPM). The outcomes measured included complications such as pacemaker pocket hematoma and infection, changes in postoperative left ventricular ejection fraction (LVEF) and left ventricular end-diastolic diameter (LVEDD) at 6 months, as well as rates of rehospitalization and mortality.
No significant differences were noted in baseline characteristics or complications between the PM and NPM groups (p > 0.05). The types of PPI and associated complications were also comparable. There was no significant disparity in the incidence of all-cause mortality (PM: 12%, NPM: 11.36%, p = 0.755), major bleeding (PM: 4%, NPM: 4.55%, p = 0.612), or cerebral embolism (PM: 12%, NPM: 4.55%, p = 0.506) between the two groups at 6 months post-TAVI. Additionally, readmission rates were similar at 1, 3, and 6 months following the procedure. Multinomial logistic regression analysis revealed that age (p = 0.020), history of cerebral infarction (p = 0.015), and hypertension (p = 0.019) were significant predictors of mortality. The survival curve indicated that fatalities in the NPM group predominantly occurred during the perioperative period. At the 6-month follow-up, there was no significant difference in survival rates between the two groups (p = 0.971). Regarding cardiac function, irrespective of PPI, a decreasing trend in LVEDD (PM: –4.19 mm, NPM: –6.16 mm, p < 0.001) and an increasing trend in LVEF (PM: +2.19%, NPM: +2.74%, p = 0.053) were observed.
This study was the first to investigate the effects of PPI on short-term mortality, adverse events, and cardiac function in PAR patients after TAVI. The results indicated that, in PAR patients, advanced age and previous cerebral infarction increase mortality after TAVI; however, PPI was not associated with mortality or adverse events at 6 months.
This study aimed to investigate the correlation between calcific aortic valve disease (CAVD) and carotid artery elasticity using ultra-fast pulse wave velocity (UFPWV) technology. Early detection of alterations in carotid artery elasticity, coupled with the prompt implementation of intervention strategies, can effectively decrease the incidence of cardiovascular diseases.
Patients with CAVD were recruited from the University-Town Hospital of Chongqing Medical University and placed in the observation group. Meanwhile, an equivalent number of patients with non-calcified aortic valve disease were recruited as controls. All participants underwent comprehensive health assessments, including measurements of blood lipids, fasting blood sugar, and other biochemical indicators. Additionally, bilateral carotid intima-media thickness (CIMT) was measured, as well as pulse wave velocity (PWV) at the beginning of systole (PWV-BS) and the end of systole (PWV-ES). Differences in various indicators between the two groups were analyzed, and the factors associated with CAVD and carotid artery elasticity were investigated. The correlation between CAVD and carotid artery elasticity was also evaluated.
Patients with CAVD exhibited significantly higher CIMT, PWV-BS, and PWV-ES levels than those with non-calcified aortic valve disease (p < 0.01). PWV-BS and PWV-ES showed progressive increases according to the severity of calcification. Coronary atherosclerotic heart disease and PWV-BS were both identified as independent risk factors for CAVD. The risk factors associated with PWV-BS included hypertension, coronary atherosclerotic heart disease, total cholesterol, and homocysteine (p < 0.05 for all). The risk factors related to PWV-ES included hypertension, coronary atherosclerotic heart disease, total cholesterol, and glycated hemoglobin (p < 0.05 for all).
UFPWV is a novel technology for the early assessment of carotid elasticity. Evaluating carotid artery atherosclerosis in patients with CAVD may lead to earlier detection and intervention and reduce the incidence of cardiovascular events.
About 20% of patients with coronary artery disease (CAD) experience adverse events within five years of undergoing percutaneous coronary intervention (PCI) for acute myocardial infarction. In these patients, the impact of metformin on long-term prognosis remains uncertain.
This study enrolled 22 CAD patients with diabetes mellitus (DM) who had been administered metformin for at least six months before PCI (Met CAD-DM group), 14 CAD-DM patients who had never taken metformin or had stopped taking it for at least a year before PCI (non-Met CAD-DM group), and 22 matched healthy controls. A 5-year follow-up was conducted to collect clinical prognosis data. Fecal 16S rRNA sequencing and serum untargeted metabolomics analyses were performed. BugBase was utilized to analyze possible functional changes in the gut microbiome. Multi-omics analysis was conducted using Spearman’s correlation to explore the interactions between metformin, the gut microbiome, serum metabolites, and clinical prognosis.
Metformin significantly lowered the 5-year rate of major adverse cardiac events (MACEs) in Met CAD-DM patients. We found a higher abundance of Bacteroides coprocola, Bacteroides massiliensis, Phascolarctobacterium succinatutens, and Eubacterium coprostanoligenes in the Met CAD-DM patients, as well as increases in hydroxy-alpha-sanshool (HAS) and decenoylcarnitine and decreases in tridec-10-enoic acid, Z-vad-fmk (benzyloxycarbonyl–Val–Ala–Asp(OMe)–fluoromethylketone), and 3,9-dimethyluric acid in blood serum. Multi-omics analysis revealed that alterations in the gut microbiome and serum metabolites are significantly associated with the 5-year prognosis of CAD-DM.
Metformin significantly improved the 5-year prognosis of CAD patients following PCI. Metformin tended to have more positive effects on the commensal flora and metabolic profiles, which may explain its beneficial effects on cardiovascular health. This study revealed the potential associations between metformin and the gut microbiome, an associated alteration in serum metabolome, and the impact on the host immune system and metabolic pathways.
Clinical observations have shown that cases of stroke or thromboembolism are not uncommon even in the absence of atrial fibrillation, suggesting that atrial fibrillation is a delayed marker of atrial thrombus formation. Atrial cardiomyopathy (ACM) is a pathophysiological concept characterized by atrial substrate and functional abnormalities closely associated with atrial myopathy, atrial enlargement, and impaired ventricular diastolic function. It is an independent risk factor for thromboembolic stroke, increasing the risk of serious complications such as atrial fibrillation, heart failure, and sudden cardiac death. ACM is likely to be a potential cause of embolic stroke, especially cryptogenic stroke, and early identification of patients at high thromboembolic risk is essential to guide anticoagulation therapy. Although the pathogenesis of ACM has not been fully elucidated, prospective mechanism-based studies have revealed the important role of activated cardiac immune cells along with inflammatory responses, oxidative stress, and other factors in its progression. Exploring the role of immune regulation in the pathogenesis of ACM provides new insights into the underlying mechanisms of cerebrovascular events of cardiac thromboembolic origin. This review summarizes the mechanisms by which immune regulation is involved in the progression of ACM and provides useful insights for future clinical diagnosis and treatment.
The primary objective of this research was to determine the predictive value of the residual SYNTAX (Synergy Between Percutaneous Coronary Intervention With Taxus and Cardiac Surgery) score II (rSS-II) for long-term outcomes in individuals with complex coronary artery disease (CAD) and chronic renal insufficiency (CRI) who have undergone percutaneous coronary intervention (PCI).
A total of 1161 consecutive patients with complex CAD and CRI after PCI were retrospectively recruited from Cangzhou Central Hospital affiliated with Hebei Medical University between January 2014 and September 2017. The patients were stratified into three categories based on rSS-II tertiles: low rSS-II (n = 388), medium rSS-II (n = 389), and high rSS-II (n = 384). The primary endpoints were all-cause mortality (ACM) and cardiac mortality (CM), while the secondary endpoint was major adverse cardiovascular and cerebrovascular events (MACCEs), which included ACM, myocardial infarction, stroke, or unplanned revascularization. The discrimination, calibration, and clinical utility of the rSS-II for predicting long-term outcomes were examined.
The median follow-up period was 37 months (19 to 61 months). The Kaplan–Meier estimated rates of ACM (2.4% vs. 5.9% vs. 13.9%; p < 0.001) and CM (1.9% vs. 2.8% vs. 9.2%; p < 0.001) revealed significant differences among the three categories. Multivariate Cox regression analysis demonstrated that the rSS-II could independently predict ACM (hazard ratio: 1.08, 95% confidence interval: 1.04–1.12; p < 0.001) and CM (hazard ratio: 1.07, 95% confidence interval: 1.02–1.12; p = 0.009). The rSS-II performed satisfactorily in both discrimination (areas under the curve for ACM and CM of 0.710 and 0.728, respectively) and calibration (Greenwood–Nam–D’Agostino goodness-of-fit test for long-term outcomes; p > 0.05 for all). Additionally, decision curve analysis showed that the rSS-II had a high net benefit for long-term outcomes across threshold probabilities, indicating its utility in daily practice.
The rSS-II is beneficial for predicting and stratifying the risk of long-term outcomes in individuals with complex CAD and CRI who have undergone PCI.
Ultrafiltration (UF) is an alternative approach to diuretic therapy for the treatment of acute heart failure (AHF), but its optimal endpoint is unclear. This study explored the use of a non-invasive ultrasonic cardiac output monitor (USCOM) to determine UF endpoints based on hemodynamic changes.
In this single-blind, randomized controlled trial, acute decompensated heart failure patients were randomly assigned to the UF (U, n = 20) and USCOM+UF (UU, n = 20) groups at a ratio of 1:1. A mixed linear model was utilized to analyze repeated-measurement data of hemodynamic indicators (primary endpoint) in the U and UU groups. A 30% or 50% decrease in B-type natriuretic peptide (BNP) concentration relative to baseline was established as the criterion for UF endpoint success. Multivariate logistic regression was used to identify USCOM indicators that could have influenced UF endpoint success. Receiver operating characteristic (ROC) curves were used to evaluate the value of the predictive models. Economic benefits, including treatment costs and hospitalization duration, were also assessed.
Change rates in mean arterial pressure, heart rate (HR), urine output, hematocrit, and BNP concentration were similar between the U and UU groups over 7 days (all p > 0.05). On day 4, significant correlations were found between BNP and various USCOM parameters, including inotropy (INO), systemic vascular resistance index (SVRI), systemic vascular resistance, corrected flow time (FTc), and velocity time integral. Multivariate logistic regression revealed that INO and SVRI were correlated with a 30% reduction in BNP on day 4 compared to baseline, while FTc and HR were independently associated with a 50% reduction in BNP on day 4 compared to baseline. The UF endpoint prediction formula for a 30% reduction in BNP was –2.462 + 0.028 × INO – 0.069 × SVRI, with a sensitivity, specificity, and accuracy of 70%, 83%, and 75%, respectively. The UF endpoint prediction formula for a 50% reduction in BNP was –2.640 – 0.088 × FTc – 0.036 × HR, with a sensitivity, specificity, and accuracy of 83%, 63%, and 72.5%, respectively. The addition of USCOM significantly reduced treatment costs and hospitalization stay lengths.
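For illustration, the reported endpoint formulas appear to be linear predictors from the logistic models; assuming so, a predicted probability of UF endpoint success can be obtained with the standard logistic transformation. The coefficients below are taken from the abstract, while the INO, SVRI, FTc, and HR input values are purely hypothetical:

```python
import math

def uf_endpoint_probability(intercept, coeffs, values):
    """Probability from a logistic-model linear predictor
    z = intercept + sum(coef * value)."""
    z = intercept + sum(c * v for c, v in zip(coeffs, values))
    return 1.0 / (1.0 + math.exp(-z))

# 30% BNP-reduction model: z = -2.462 + 0.028 * INO - 0.069 * SVRI
# (input values are hypothetical, for illustration only)
p30 = uf_endpoint_probability(-2.462, (0.028, -0.069), (120.0, 6.0))

# 50% BNP-reduction model: z = -2.640 - 0.088 * FTc - 0.036 * HR
p50 = uf_endpoint_probability(-2.640, (-0.088, -0.036), (33.0, 80.0))
```

A threshold (commonly 0.5 on the probability, i.e., z = 0) would then classify patients as likely or unlikely to reach the UF endpoint; the study does not state the cutoff used.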
USCOM-based probability formulas helped determine appropriate UF endpoints during AHF treatment. UF combined with USCOM can reduce the costs of UF and hospitalization.
NCT06533124, https://clinicaltrials.gov/study/NCT06533124?term=NCT06533124&rank=1.
Complex high-risk and indicated patients (CHIPs) are at increased risk of in-hospital death after percutaneous coronary intervention (PCI). Extracorporeal membrane oxygenation (ECMO) support can improve survival. However, there remains a gap in knowledge regarding how to identify and manage these high-risk patients effectively to reduce mortality. This study aimed to determine the independent high-risk factors associated with increased in-hospital mortality among CHIPs after PCI with ECMO support, providing clinicians with more accurate risk assessment tools for devising more effective treatment plans for these patients.
The EMBASE, PubMed, Cochrane Library, Web of Science, Chinese Biomedical Database, China National Knowledge Infrastructure, China Science and Technology Journal Database, and Wanfang databases were searched from their inception to October 1, 2024, to identify observational studies examining mortality risk among adult CHIPs (age ≥18 years). The primary outcome was in-hospital mortality. The meta-analysis used random-effects models to obtain summary odds ratios (ORs) with 95% confidence intervals (CIs). The Cochrane risk-of-bias tool was used to assess the quality of evidence.
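The random-effects pooling step described above is commonly implemented with the DerSimonian–Laird estimator; the following is a minimal sketch (not the authors' actual code) that pools per-study log-ORs and standard errors, with hypothetical example inputs:

```python
import math

def dersimonian_laird(effects, ses):
    """Pool per-study effects (e.g., log-ORs) with DerSimonian-Laird
    random-effects weights; returns (pooled effect, SE, tau^2)."""
    k = len(effects)
    w = [1.0 / s**2 for s in ses]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                 # between-study variance
    w_re = [1.0 / (s**2 + tau2) for s in ses]          # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical log-ORs and SEs from three studies:
log_or, se, tau2 = dersimonian_laird([1.10, 1.35, 0.95], [0.30, 0.40, 0.25])
odds_ratio = math.exp(log_or)  # pooled OR on the original scale
```

The same machinery applies to the mean differences reported below, substituting per-study MDs and their standard errors for the log-ORs.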
Ten studies with 306 participants were included. In pooled analyses, cardiogenic shock (CS) or cardiac arrest (CA)-to-ECMO time (mean difference (MD): 34.61, 95% confidence interval (CI): 26.70 to 42.52; p < 0.00001), ECMO duration (MD: –19.93, 95% CI: –32.85 to –7.02; p = 0.002), infarction of the left anterior descending artery (LAD; OR: 3.16, 95% CI: 1.83 to 5.47; p < 0.0001), body mass index (BMI; MD: 1.52, 95% CI: 1.06 to 1.97; p < 0.00001), lactate levels (MD: 3.15, 95% CI: 2.37 to 3.94; p < 0.00001), left ventricular ejection fraction (LVEF; MD: –4.09, 95% CI: –6.17 to –2.00; p = 0.0001), mean arterial pressure (MAP; MD: –24.92, 95% CI: –32.19 to –17.65; p < 0.00001), heart rate, male sex, and left circumflex and right coronary artery involvement were associated with in-hospital mortality.
CHIPs with a longer CS- or CA-to-ECMO time, shorter ECMO duration, LAD infarction, higher BMI, elevated lactate levels, and lower LVEF and MAP have an increased risk of in-hospital death.
Endocarditis can lead to health loss and even death, making it one of the major contributors to the global disease burden, and its incidence is continuously increasing. This study aimed to assess trends in the worldwide burden of endocarditis over the past 30 years, perform frontier analysis, and project its future burden through 2035.
We analyzed the trends of global endocarditis incidence, prevalence, deaths, and disability-adjusted life years (DALYs) at international, regional, and national levels from 1990 to 2021 using a comprehensive, localized, and multidimensional approach. Clustering analysis assessed the changing patterns of disease burden related to endocarditis in the Global Burden of Disease (GBD) study regions. Correlation analysis was conducted to determine the potential relationships between the burden of endocarditis and the socio-demographic index (SDI) and the Human Development Index (HDI). Frontier analysis was performed to identify possible areas for improvement and the disparities in development status among countries. Additionally, we projected the changes in the burden of endocarditis by 2035.
From a global perspective, between 1990 and 2021, the incidence, prevalence, mortality, and DALYs associated with endocarditis showed a continuous upward trend. At the national level, significant differences were observed in the incidence, prevalence, mortality, and DALYs of endocarditis worldwide. The United States had the highest number of deaths, India the highest number of DALYs, Thailand the highest incidence, and Sri Lanka the highest prevalence. Age-specific rates of endocarditis prevalence, incidence, mortality, and DALYs increased steadily with age, peaking in the 95-years-and-older age group. The incidence, prevalence, mortality, and DALYs for males were 1.27 times, 1.02 times, 1.06 times, and 1.37 times those of females, respectively. Clustering analysis indicated a significant increase in the estimated annual percentage change (EAPC) of mortality and DALY rates for endocarditis in East Asia. A significant correlation exists between the EAPC and the age-standardized rates (ASRs) of disease burden. Frontier analysis showed that countries and regions with higher SDIs have greater potential for improving the disease burden. The Bayesian age–period–cohort (BAPC) results indicated that the numbers of incident, prevalent, death, and DALY cases are expected to increase, with the ASRs for incidence and prevalence also projected to show a continuous upward trend by 2035.
The global burden of endocarditis, a significant public health issue, has shown an overall upward trend from 1990 to 2021. The continuous increase in the prevalence and incidence of endocarditis, driven by population growth and aging, has become a major challenge for its control and management, which may guide better public health policy formulation and the rational allocation of medical resources. This targeted approach is crucial for effectively alleviating the burden of this disease.
Hypertensive disorders of pregnancy (HDP) pose substantial risks to both maternal and fetal health, thereby highlighting the need for precise and comprehensive blood pressure (BP) monitoring methods. Ambulatory blood pressure monitoring (ABPM) offers advantages over traditional office BP measurements by enabling continuous 24-hour assessment, thus capturing circadian BP variations, including nocturnal and morning hypertension, which are often missed when BP is measured in a medical office. This capacity for detailed monitoring allows ABPM to identify specific BP phenotypes, such as normotension, white-coat hypertension, masked hypertension, and sustained hypertension. Each of these phenotypes has unique implications for risk stratification, which helps to identify high-risk pregnancies early and potentially improve outcomes through more targeted interventions. Despite these advantages, three key challenges have limited the widespread adoption of ABPM during pregnancy. First, the complex dynamics in BP variations throughout gestation are influenced by physiological adaptations, such as uterine artery remodeling, which lowers BP before 20 weeks and increases mean arterial pressure after 20 weeks to support fetal growth. Second, adaptive changes in the maternal arterial system alter vascular mechanical properties, complicating accurate BP assessments. Third, diagnostic thresholds specific to pregnancy that are directly linked to adverse pregnancy outcomes are lacking. Therefore, this review addresses the role of ABPM in managing HDP, examining BP dynamics and the suitability of monitoring devices, and ongoing efforts to develop diagnostic thresholds tailored to pregnancy. By exploring these aspects, this review underscores the importance of ABPM in advancing more precise, effective strategies for HDP management and multidisciplinary management programs for pregnant women to enhance clinical decision-making and maternal–fetal outcomes.
This study aimed to use four-dimensional automatic left atrial quantification (4D Auto LAQ) to quantitatively evaluate the morphological and functional changes in the left atrium (LA) in asymptomatic type 2 diabetes mellitus (T2DM) patients with early chronic kidney disease (CKD), and explore its correlation with major adverse cardiovascular event (MACE) occurrence.
This study enrolled patients with asymptomatic T2DM complicated by early CKD. 4D Auto LAQ was then used to evaluate LA volume indices (minimum, maximum, and pre-ejection) and LA longitudinal and circumferential strains during each of the three LA phases: reservoir, conduit, and contraction. The primary endpoint for follow-up was defined as the first occurrence of nonfatal acute myocardial infarction, stroke, congestive heart failure, or cardiac death. Univariate and multivariate Cox proportional hazard analyses were used to evaluate the correlation between LA parameters and MACEs in T2DM patients with early CKD.
A total of 361 patients were analyzed (mean age, 59.51 ± 11.17 years). During a median follow-up of 47 months (interquartile range, 17–59 months), MACEs occurred in 70 patients. After adjusting for various clinical and echocardiographic predictors, increased LA volume and impaired reservoir function (ResF) were each independently associated with the primary endpoint: the left atrial minimum volume index (LAVImin) had an adjusted hazard ratio (HR) of 1.21 (95% confidence interval (CI), 1.08–1.35; p = 0.010), whereas left atrial longitudinal strain during the reservoir phase (LASr) had an adjusted HR of 0.81 (95% CI, 0.74–0.89; p < 0.001). Univariate and multivariate Cox regression analyses indicated that the cumulative incidence of MACEs was significantly greater in patients with LAVImin >16.9 mL/m² than in those with LAVImin ≤16.9 mL/m² (HR, 2.25; 95% CI, 1.03–6.39; p = 0.005). Furthermore, patients with a LASr <18.5% had a nearly fourfold greater risk of MACEs than individuals with a LASr ≥18.5% (HR, 3.95; 95% CI, 1.76–8.86; p < 0.001).
An enlarged left atrium (LAVImin) and impaired ResF (LASr) are strongly associated with long-term outcomes in T2DM patients complicated with early CKD. LASr showed the strongest associations with the occurrence of MACEs.
Despite evidence suggesting a link between lipoprotein(a) (Lp(a)) and the occurrence of acute myocardial infarction (AMI), its relationship with prognosis after AMI remains unclear. This meta-analysis was conducted to summarize the association between Lp(a) and the risk of major adverse cardiovascular events (MACEs) among populations surviving AMI.
We searched the PubMed, Embase, Web of Science, MEDLINE, and Cochrane Library databases until February 14, 2024. Cohort studies reporting multivariate-adjusted hazard ratios (HRs) for the correlation of Lp(a) with MACEs in AMI populations were identified. Lp(a) levels were analyzed as categorical and continuous variables. Subgroup analyses were conducted based on sex, AMI type, and diabetic and hypertensive status. Publication bias was assessed using funnel plots. A random-effects model was utilized to pool the results.
In total, 23 cohorts comprising 30,027 individuals were included. In comparison to those in the lowest serum Lp(a) category, individuals in the highest category showed higher risks of MACEs after AMI (HR: 1.05, 95% confidence interval (CI): 1.01–1.09, p = 0.006). Similar findings were exhibited when Lp(a) was analyzed as a continuous variable (HR: 1.14, 95% CI: 1.02–1.26, p = 0.02). Subgroup analyses indicated that this correlation persisted significantly in the female (HR: 1.23, p = 0.005), diabetes mellitus (DM) (HR: 1.39, p = 0.01), hypertension (HR: 1.36, p < 0.00001), ST-segment elevation myocardial infarction (STEMI) (HR: 1.03, p = 0.04), non-STEMI (HR: 1.40, p = 0.03), and long-term (>1 year) MACE (HR: 1.41, p = 0.0006) subgroups.
Higher Lp(a) levels might be an independent indicator of MACE risk after AMI, especially among females and patients with DM and/or hypertension, and may be more suitable for evaluating long-term MACEs.
CRD42024511985, https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42024511985.
Acute aortic dissection (AAD) is a rare but life-threatening disease, and its rapid and correct diagnosis is important. Heart rate (HR) is a risk factor for death in patients with AAD, but the nature of this relationship remains unclear. This meta-analysis aimed to evaluate whether there is a significant correlation between HR and AAD mortality risk.
By searching the PubMed, Embase, and Web of Science databases, studies reporting the correlation between HR and AAD mortality were obtained, and their methodological quality was evaluated. Relative risk (RR) with 95% confidence interval (CI) was used as the effect size. Subgroup analysis, sensitivity analysis, and publication bias tests (Egger’s test and funnel plots) were used to identify sources of heterogeneity and evaluate the stability of the results.
Ten studies enrolling >4000 patients were included. Increased HR was positively correlated with increased AAD mortality risk (RR [95% CI] = 1.04 [1.01–1.07], p = 0.006). There was significant statistical heterogeneity among the included studies. The timing of HR monitoring, AAD type, and follow-up time were sources of heterogeneity. Sensitivity analysis showed that the combined results were stable. There was significant publication bias among the included studies; however, the trim-and-fill method showed that the publication bias had little effect on the combined results (RR [95% CI] = 1.038 [1.010–1.066], p = 0.008).
There was a positive relationship between increased HR and increased AAD mortality.
Prior research on the relationship between iron status and arterial stiffness is limited, with causality still unclear. However, understanding these connections is crucial for improving the prevention and management of arterial stiffness. Therefore, this study aimed to examine the impact of iron status and other micronutrients on arterial stiffness risk using Mendelian randomization (MR) approaches.
MR was performed utilizing genome-wide association study (GWAS) data from European populations to investigate the causal links between various nutrients (iron, among others) and the arterial stiffness index. We selected the random-effects inverse-variance weighting (IVW) approach for the primary analysis and conducted numerous sensitivity tests to ensure robustness.
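As a sketch of the primary analysis, the IVW estimator combines per-variant Wald ratios weighted by their inverse variance; a multiplicative random-effects variant inflates the standard error when heterogeneity exceeds its expectation. This is illustrative only, with hypothetical per-SNP inputs, not a reproduction of the study's pipeline:

```python
import math

def ivw_random_effects(betas, ses):
    """IVW causal estimate from per-variant Wald ratios, with a
    multiplicative random-effects standard error."""
    k = len(betas)
    w = [1.0 / s**2 for s in ses]                           # inverse-variance weights
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)  # pooled causal estimate
    se_fixed = math.sqrt(1.0 / sum(w))
    q = sum(wi * (b - beta)**2 for wi, b in zip(w, betas))  # heterogeneity statistic
    phi = max(1.0, q / (k - 1)) if k > 1 else 1.0           # overdispersion factor
    return beta, se_fixed * math.sqrt(phi)

# Hypothetical per-SNP estimates (e.g., serum iron -> arterial stiffness index):
beta, se = ivw_random_effects([0.07, 0.05, 0.09], [0.02, 0.03, 0.02])
```

When the per-SNP estimates agree (q small), the random-effects SE reduces to the fixed-effect SE; with heterogeneity, it widens accordingly.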
This study found a causal effect of genetically predicted high levels of serum iron (β = 0.069, 95% confidence interval (CI): 0.031 to 0.107; false discovery rate-adjusted p (pFDR) = 1.87 × 10⁻³), ferritin (β = 0.143, 95% CI: 0.050 to 0.235; pFDR = 8.28 × 10⁻³), and transferrin saturation (β = 0.053, 95% CI: 0.025 to 0.080; pFDR = 1.29 × 10⁻³) on the arterial stiffness index. There was no evidence of reverse causality. Associations derived from multivariable MR analyses remained significant after adjusting for potential confounders. Zinc and carotene levels may be inversely linked with arterial stiffness.
This study provides a genetic basis for the causal relationship between elevated iron status and increased arterial stiffness, suggesting the important role of micronutrients in the disease process.
The platelet-to-lymphocyte ratio (PLR) is applied as a potential first-line prognostic predictor for many cardiovascular diseases due to its simplicity and accessibility. This meta-analysis aimed to quantify the predictive power of PLR for major adverse cardiovascular events (MACEs) in patients with acute coronary syndrome (ACS) undergoing percutaneous coronary intervention (PCI), explore its predictive efficacy in different populations, and identify other potential influencing factors.
PubMed, Embase, Cochrane Library, and Web of Science databases were comprehensively searched for eligible studies until February 7, 2025, based on the inclusion and exclusion criteria. The Newcastle–Ottawa scale (NOS) was employed for quality assessment. Sensitivity, specificity, summary receiver operating characteristic (SROC) curves, and area under the curve (AUC) were pooled using Stata 15.1 and Meta-DiSc software. Meta-regression analyses, subgroup analyses, threshold effect analyses, sensitivity analyses, and publication bias tests were performed.
Nine studies (7174 patients) were included. A high PLR predicted MACEs in ACS patients undergoing PCI, with a sensitivity of 0.68 (95% confidence interval (CI), 0.60–0.76), a specificity of 0.65 (95% CI, 0.57–0.73), and an AUC of 0.72 (95% CI, 0.68–0.76). Subgroup analyses showed that PLR better predicted MACEs after PCI in ACS patients in the subgroup with a higher proportion of female patients and the subset aged >60 years. Meta-regression analyses revealed that study type (p < 0.01) and the PLR cutoff value (p < 0.01) might be sources of heterogeneity in the sensitivity analyses, while mean age (p < 0.001) and sex ratio (p = 0.05) might be sources of heterogeneity in the specificity analyses.
High PLR levels have favorable values in predicting in-hospital and long-term MACEs after PCI in ACS patients. The PLR had greater sensitivity and an improved ability to identify risk in patients aged >60 years and the subgroup with a higher proportion of women and was also more sensitive to in-hospital MACEs.
No. CRD42024537586, https://www.crd.york.ac.uk/PROSPERO/view/CRD42024537586.
Cardiovascular diseases, including acute myocardial infarction, heart failure, hypertension, adverse cardiac remodeling, hypertrophy, atherosclerosis, and coronary artery disease, remain the leading cause of global mortality. Annual global cancer mortality follows closely behind, emphasizing the need to develop novel therapeutic approaches. MicroRNAs (miRNAs), a class of short non-coding RNAs, regulate cascades of signaling pathways and their downstream targets, exerting control over numerous biological processes. Dysregulation of specific miRNAs is linked to the pathogenesis of various diseases, including cancer and cardiovascular disease. Among these miRNAs, the miRNA-17-92 cluster plays versatile roles at the nexus of critical physiological and pathological processes, including cardiac diseases and malignancy. This review aimed to provide a holistic analysis of the current progress in identifying, developing, and utilizing the miRNA-17-92 cluster to combat cardiovascular diseases and cancer. The members of the miRNA-17-92 cluster exert control over numerous cellular pathways that regulate, suppress, and promote various aspects of cardiomyocyte differentiation, regeneration, and aging. Certain pathways controlled by the cluster are protective when properly expressed; others can propagate unchecked cardiovascular disease progression and mortality when poorly regulated. Similarly, the miRNA-17-92 cluster plays critical regulatory roles in the occurrence, metastasis, and prognosis of multiple cancers, which may allow the cluster to serve as a diagnostic and prognostic biomarker of malignancy. This review provides a brief overview of the multifaceted roles of the miRNA-17-92 cluster to deliver some insight into the development of novel targeted therapeutics for cardiovascular diseases and cancer via controlling the expression of specific subsets within this cluster.
Additionally, this review systematically summarizes the established molecular mechanisms of the miRNA-17-92 cluster and its therapeutic potential in the dual pathological contexts of cardiovascular disease and cancer.
Atrial fibrillation (AF) and hypertension are associated with inflammatory response and oxidative stress. Uric acid (UA) is a product of oxidative reactions and a surrogate indicator of oxidative stress. However, whether UA imposes a greater risk of AF in hypertensive patients remains unclear. This study sought to evaluate the evidence supporting an association between serum uric acid (SUA) and AF in patients with hypertension.
We searched the PubMed, Cochrane Library, EMBASE, and Web of Science databases through December 31, 2023, without language restrictions, for observational studies in which SUA was measured and AF was reported in patients with hypertension. We calculated the pooled mean difference in SUA between hypertensive patients with and without AF.
A total of five studies were included. Three cross-sectional studies comprised 4191 patients with hypertension. The standardized mean difference (SMD) of SUA for those with AF was 0.60 (95% confidence interval (CI) 0.15–1.05) compared with those without AF. Two cohort studies evaluated 9810 individuals with hypertension, and the pooled effect estimate for AF risk was 0.03 (95% CI –0.05 to 0.11), revealing no significant difference between high SUA and normal SUA.
Our findings demonstrate a significant association between SUA and AF in patients with hypertension. Further studies are needed to investigate the underlying pathophysiological mechanisms and to assess the value of SUA as a marker of, or a potential therapeutic target for, AF in patients with hypertension.
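As an illustration of how pooled estimates like the SMD above are typically derived (the abstract does not give the authors' exact procedure, so this is a hedged sketch of the standard approach), a standardized mean difference can be computed per study and combined with inverse-variance weighting:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) between two groups."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def pool_fixed_effect(smds, variances):
    """Inverse-variance fixed-effect pooled SMD and its 95% CI."""
    weights = [1 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

A random-effects model (as often used when heterogeneity is present) would additionally incorporate a between-study variance term into each weight.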
Intraplaque neovascularization (IPN) correlates significantly with plaque vulnerability and can be detected using Angio PLanewave UltraSensitive imaging technology (Angio PL.U.S.; AP). Several immune–inflammatory biomarkers that reflect the state of inflammation and immune homeostasis in the body are currently used to assess cardiovascular and cerebrovascular diseases. This study aimed to investigate the correlation between carotid IPN scores and several immune–inflammatory indicators in patients with different degrees of coronary artery stenosis.
This study prospectively enrolled 107 patients with coronary artery stenosis confirmed by coronary angiography (CAG). Preoperative ultrasonography was performed to screen for carotid plaques, and AP was performed to determine whether IPN was present and to score it accordingly. The levels of immune–inflammatory indicators, plaques, and coronary artery lesions between groups with and without IPN and with different IPN scores were analyzed. We utilized logistic regression models to determine the independent predictors of IPN and constructed receiver operating characteristic (ROC) curves. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated.
Differences in systemic immune inflammation index (SII) levels and plaque thicknesses were found between the groups with and without IPN and between different IPN scores (p < 0.05). The IPN scores were positively correlated with SII levels (r = 0.268, p = 0.005), plaque thickness (r = 0.273, p = 0.005), and Gensini score (r = 0.446, p < 0.001). SII levels (per 10-unit increase) (OR = 1.031) and plaque thickness (OR = 1.897) were independent risk factors for IPN. At cutoff values of 541 × 10⁹/L for the SII and 2.25 mm for plaque thickness, the areas under the curve (AUCs) were 0.653 and 0.656, respectively. The AUC of the combined diagnosis was 0.711.
Elevated SII levels and increased plaque thickness were associated with the vulnerability of carotid plaques in patients with coronary artery stenosis and may signal increased coronary artery stenosis.
ChiCTR2400094458, https://www.chictr.org.cn/hvshowprojectEN.html?id=266292&v=1.0.
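The SII referenced above is conventionally calculated from routine blood counts; a minimal sketch, assuming all counts are expressed in 10⁹/L (matching the cutoff reported in the abstract):

```python
def systemic_immune_inflammation_index(platelets, neutrophils, lymphocytes):
    """SII = platelet count x neutrophil count / lymphocyte count.

    All counts in 10^9/L; the result is likewise in 10^9/L.
    """
    return platelets * neutrophils / lymphocytes

# Illustrative values (hypothetical patient): platelets 250, neutrophils 4.5,
# lymphocytes 2.0 -> SII = 562.5, above the 541 cutoff reported above.
```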
Once considered the “forgotten valve and ventricle”, the tricuspid valve and right ventricle are now recognized as critical structures with significant clinical and prognostic implications. Growing evidence has highlighted that tricuspid regurgitation (TR) and right heart failure are not merely secondary phenomena that resolve following the treatment of left-sided heart disease. Instead, TR and right heart failure contribute to adverse outcomes and increased mortality if left untreated. This paradigm shift has fueled extensive clinical research, leading to a deeper understanding of the pathophysiology of TR and right ventricular (RV) dysfunction. Additionally, advancements in cardiovascular imaging have facilitated early detection, risk stratification, and innovative therapeutic approaches for TR and right heart failure. This article explores the evolving landscape of tricuspid valve disease, emphasizing the importance of early recognition and the role of emerging imaging technologies in improving patient outcomes. Thanks to progress in imaging technology, especially echocardiography, as well as cardiac magnetic resonance and cardiac computed tomography, enhanced studies of tricuspid valve pathology can be conducted to delineate the various mechanisms involved in TR and RV dysfunction and to offer patients tailored medical, surgical, and transcatheter therapies. These unparalleled technological advances would not be possible without the hard work of physicians, scientists, surgeons, interventional cardiologists, and echocardiographers worldwide, despite the many challenges they experience daily and in every procedure. Many patients with TR present at an advanced stage of disease progression, often with severe regurgitation and clinical manifestations associated with poor outcomes.
Additionally, a significant proportion of these patients have either undergone previous open-heart surgery for left-sided valvular disease or are considered high-risk surgical candidates due to multiple comorbid conditions. In recent years, transcatheter therapy has emerged as a viable alternative for this high-risk population, offering a less invasive option for those previously deemed “inoperable”. This breakthrough has transformed the therapeutic landscape for valvular heart disease, particularly for TR, providing new hope and improved outcomes for patients who were once left with limited treatment options.
The clinical prognosis of ST-elevation myocardial infarction (STEMI) patients with mural thrombus in left ventricular aneurysm (MTLVA) remains poor; moreover, the risk factors associated with the non-resolution (persistent or recurrent) of MTLVA are not well understood. This study aimed to identify independent risk factors for MTLVA non-resolution.
A total of 133 STEMI patients (mean age 62 ± 11 years, 80.5% male) with MTLVA, admitted to our department between 2014 and 2022, were included in this retrospective analysis. Patients were categorized into two groups: resolution (n = 59) and non-resolution [persistent (n = 72) or recurrent (n = 2) MTLVA; n = 74]. The median follow-up duration was 25 months, during which adverse events were monitored, including stroke, re-revascularization, major bleeding, systemic embolism, and cardiac death.
The prevalence of non-resolution was 55.6%. Non-resolution was significantly associated with elevated lipoprotein (a) [Lp(a)] levels (>270 mg/L, hazard ratio (HR) 2.270, p = 0.003), larger left ventricular aneurysm (LVA) area (>4.5 cm², HR 4.038, p < 0.001), and greater mural thrombus (MT) area (>2.2 cm², HR 2.40, p = 0.002), independent of other risk factors, such as hypercholesterolemia and left circumflex artery (LCX)-related STEMI. Baseline left ventricular ejection fraction (LVEF) was lower in the non-resolution group (41.7% vs. 45.7%, p = 0.008). During follow-up, the LVEF remained lower in the non-resolution group and increased in the resolution group. The composite of adverse events was significantly higher in the non-resolution group (28.4% vs. 8.5%, p = 0.003), including stroke (p = 0.025) and systemic embolism (p = 0.034).
Independent risk factors for thrombus non-resolution in STEMI patients with MTLVA include elevated Lp(a) and larger LVA and MT areas. These factors contribute to thrombus persistence and are associated with worse clinical outcomes. However, further studies are needed to assess targeted management strategies for high-risk patients.
Takotsubo syndrome (TTS), also known as stress-induced cardiomyopathy or “broken heart syndrome”, is characterized by transient left ventricular dysfunction, often triggered by emotional or physical stress. Emerging evidence suggests that sleep-disordered breathing (SDB) and sleep disruption may play a significant role in the pathophysiology and exacerbation of TTS. This review explores the influence of conditions such as obstructive sleep apnea (OSA), insomnia, and other sleep disturbances on the onset and progression of TTS. SDB, particularly OSA, is marked by repetitive episodes of upper airway obstruction during sleep, leading to intermittent hypoxia and increased sympathetic nervous system activity. These physiological changes can trigger or exacerbate TTS by promoting myocardial stress and impairing autonomic regulation. Insomnia and other forms of sleep disruption also contribute to heightened sympathetic activity and elevated stress hormone levels, which may precipitate TTS in susceptible individuals. Thus, this review synthesizes current research on the mechanisms linking sleep disturbances to TTS, highlighting the impact of nocturnal hypoxia, sleep fragmentation, and autonomic dysregulation. Moreover, this review discusses the clinical implications of these findings, emphasizing the need to screen and manage sleep disorders in patients with or at risk of TTS. Addressing sleep disturbances through therapeutic interventions may reduce the incidence and recurrence of TTS, offering a novel approach to managing this condition. In conclusion, this review underscores the importance of recognizing and treating SDB and sleep disruption as potential contributors to Takotsubo syndrome. Future research should focus on elucidating the precise mechanisms involved and determining effective strategies for integrating sleep management into the care of patients with TTS.
Anemia and iron deficiency (ID) are common in patients with acute myocardial infarction (AMI), especially those in intensive care units (ICU). This study investigated the impact of hemoglobin (Hb) and ID on the short-term mortality of critically ill patients with AMI.
Overall 992 AMI patients with their first ICU admission were included in this analysis. ID was defined as serum ferritin <100 ng/mL or transferrin saturation (TSAT) <20%. Patients were categorized into four groups according to their Hb concentrations and the presence of ID. Kaplan-Meier survival analysis was used to assess differences in all-cause mortality between the different groups, and Cox regression models to identify risk factors for all-cause mortality.
Anemia was present in 89.5% of patients, while 65.9% suffered from ID. Patients in the group with Hb <9 g/dL and without ID were the youngest, yet they exhibited the highest severity scores. The Kaplan–Meier analysis showed that this group had a higher rate of all-cause mortality compared to the other three groups (Log-rank test p = 0.005). Moreover, multivariate Cox regression analysis revealed that Hb <9 g/dL and no ID was associated with a higher risk of all-cause mortality at 120 days (hazard ratio 1.512, 95% confidence interval 1.031–2.217, p = 0.034) when compared to the reference group (Hb ≥9 g/dL and no ID). Additionally, multivariate Cox regression analysis showed that lower Hb was linked to increased rates of all-cause mortality at 30, 60, 90, and 120 days. Elevated levels of ferritin and TSAT were also associated with increased all-cause mortality at 60, 90, and 120 days. Compared to patients without ID, those with ID had a decreased risk of all-cause mortality at 60, 90, and 120 days.
Anemia and ID were prevalent in ICU patients with AMI. Patients with Hb <9 g/dL and without ID showed higher 120-day all-cause mortality. Additionally, lower Hb, elevated ferritin, and increased TSAT levels were identified as significant risk factors for short-term all-cause mortality in these patients.
Transcatheter aortic valve replacement (TAVR) has emerged as the preferred treatment for symptomatic severe aortic stenosis (AS). However, China’s unique patient population presents distinct challenges, including a higher prevalence of bicuspid aortic valves (BAVs) and severe valve calcification. This study used real-world clinical data from Chinese patients to assess the safety and efficacy of the SAPIEN 3 balloon-expandable transcatheter heart valve (THV) in TAVR, particularly in patients with BAVs.
This retrospective, multicenter study enrolled consecutive severe AS patients treated with SAPIEN 3 THVs via a transfemoral approach from June 2020 to March 2024. The primary endpoint was 30-day mortality, while secondary endpoints included procedural mortality, procedural success, conversion to surgery, coronary artery occlusion, THV-in-THV deployment, permanent pacemaker implantation, and paravalvular leaks (PVLs).
Among the 1642 enrolled patients, 56.0% had BAVs, and 44.0% had tricuspid aortic valves (TAVs). The 30-day mortality rate was 0.90%. Propensity score matching revealed no statistically significant differences between patients with BAVs and TAVs in terms of 30-day mortality (odds ratio (OR): 1.51, 95% confidence interval (CI): 0.42 to 5.36; p = 0.531), immediate procedural mortality, procedural success, coronary artery occlusion, THV-in-THV deployment, permanent pacemaker implantation, or moderate to severe PVLs. However, a significant difference was found in the conversion rate to open surgery (OR: 5.07, 95% CI: 1.11 to 23.2; p = 0.036).
This study demonstrates the safety and feasibility of SAPIEN 3 balloon-expandable THVs in TAVR for Chinese patients with severe AS, including those with BAV stenosis. These findings challenge historical relative contraindications for TAVR in BAV patients and highlight the potential of TAVR in diverse patient populations. Larger prospective studies with extended follow-ups are needed to refine patient selection and evaluate longer-term outcomes.
Cardiac rehabilitation (CR) serves as a critical component in ongoing care for cardiovascular disease patients, improving postoperative anxiety and depression in cardiac surgery patients while reducing readmission rates and mortality. However, patient completion rates for CR programs remain low due to insufficient awareness and lack of social support. This study aimed to investigate the impact of family support levels on self-management behaviors in postoperative cardiac surgery patients, providing a basis for family-based cardiac rehabilitation interventions.
This cross-sectional survey involved 76 patients who had undergone major vascular surgeries one month prior and were subsequently discharged from the hospital’s cardiology department. Participants completed questionnaires assessing demographic details, family support, psychological status, and self-management practices. Logistic regression analysis identified factors influencing perceived social support from family (PSS-Fa), while correlation analyses examined relationships between family support and self-management behaviors.
The mean PSS-Fa score was 10.82 ± 1.50, and the average self-management behavior score was 140.80 ± 20.46. Female sex, marital status, and educational attainment were significantly associated with higher family support scores (p < 0.05). In the univariate analysis, key determinants of better self-management included age, educational level, marital status, household income, type of medical insurance, presence of comorbidities, cardiac function classification, and psychological states indicative of anxiety or depression (all p < 0.05). Multiple linear regression analysis showed that PSS-Fa, age, and education level significantly influenced self-management behaviors in postoperative cardiac patients. Family support and education level had a positive effect, while age had a negative impact. The model’s overall fit statistics were R² = 0.821 and F = 33.722 (p < 0.05). Pearson’s correlation analysis revealed a positive association between family support and overall self-management behaviors (r = 0.303, p < 0.05), particularly in nutrition management, exercise adherence, self-monitoring, and timely medical consultations.
This suggests that the role of family support should be fully considered in developing CR programs in the future, and targeted interventions should be implemented to enhance this support, thereby potentially improving patient outcomes and adherence to CR programs.
The study aimed to develop an interpretable machine learning (ML) model to assess and stratify the risk of long-term major adverse cardiovascular events (MACEs) in patients with premature myocardial infarction (PMI) and to analyze the key variables affecting prognosis.
This prospective study consecutively included patients (male ≤50 years, female ≤55 years) diagnosed with acute myocardial infarction (AMI) at Tianjin Chest Hospital between January 2017 and December 2022. The study endpoint was the occurrence of MACEs during the follow-up period, defined as cardiac death, nonfatal stroke, readmission for heart failure, nonfatal recurrent myocardial infarction, or unplanned coronary revascularization. Four machine learning models were built: Cox proportional hazards (Cox) regression, random survival forest (RSF), extreme gradient boosting (XGBoost), and DeepSurv. Models were evaluated using the concordance index (C-index), Brier score, and decision curve analysis to select the best model for prediction and risk stratification.
A total of 1202 patients with PMI were included, with a median follow-up of 26 months, and MACEs occurred in 200 (16.6%) patients. The RSF model demonstrated the best predictive performance (C-index, 0.815; Brier, 0.125) and could effectively discriminate between high- and low-risk patients. The Kaplan-Meier curve demonstrated that patients categorized as low risk showed a better prognosis (p < 0.0001).
The prognostic model constructed based on RSF can accurately assess and stratify the risk of long-term MACEs in PMI patients. This can help clinicians make more targeted decisions and treatment choices, thereby delaying or reducing poor outcomes.
Previous studies have indicated that blood lipids can influence skeletal health. However, limited research exists on the impact of serum apolipoprotein B (ApoB) on bone mineral density (BMD); meanwhile, it remains unclear what role cardiovascular disease plays in mediating this process.
Therefore, we conducted a cross-sectional analysis involving 2930 participants from the National Health and Nutrition Examination Survey (NHANES) database to explore the relationship between serum ApoB and total body BMD (TB-BMD) and lumbar spine BMD (LS-BMD). We employed a two-step, two-sample Mendelian randomization (MR) analysis using genetic instruments to investigate causality and assess the mediating effects of six cardiovascular diseases.
Multivariable linear regression models demonstrated an inverse linear association between serum ApoB and TB-BMD (β = –0.26, 95% confidence interval (CI): –0.41 to –0.12, p < 0.001; p for non-linearity = 0.771) and LS-BMD (β = –0.53, 95% CI: –0.75 to –0.31, p < 0.001; p for non-linearity = 0.164). The primary analysis utilized the multiplicative random effects inverse variance weighted (IVW-MRE) method for the two-sample MR analysis. The results demonstrated a causal relationship between serum ApoB and TB-BMD (β = –0.0424, 95% CI: –0.0746 to –0.0103; p = 0.0096) and LS-BMD (β = –0.0806, 95% CI: –0.1384 to –0.0229; p = 0.0062). The two-step MR analysis indicated heart failure as a mediating factor in the causal relationship between serum ApoB and TB-BMD, with a mediation proportion of 18.69%.
The results of this study support the notion that lowering serum ApoB levels could enhance BMD, and that preventing heart failure might mitigate the reduction in BMD caused by elevated ApoB levels.
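For readers unfamiliar with the analysis, the basic inverse-variance weighted (IVW) Mendelian randomization estimate and the mediation proportion can be sketched as follows. This is an illustrative simplification under a fixed-effect weighting assumption, not the authors' pipeline, which used a multiplicative random-effects IVW model:

```python
import math

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect IVW MR estimate of the exposure's causal effect on the outcome.

    Each instrument contributes the ratio beta_outcome/beta_exposure,
    weighted by the inverse variance of that ratio (delta-method
    approximation ignoring uncertainty in beta_exposure).
    """
    weights = [bx**2 / se**2 for bx, se in zip(beta_exposure, se_outcome)]
    ratios = [by / bx for bx, by in zip(beta_exposure, beta_outcome)]
    beta = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return beta, se

def mediation_proportion(total_effect, indirect_effect):
    """Share of the total causal effect explained by the mediator (two-step MR)."""
    return indirect_effect / total_effect
```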
Atrial fibrillation (AF) is the most common supraventricular arrhythmia, affecting 2–3% of the adult population, with an increasing prevalence due to demographic shifts; however, detection methods have also improved. This rhythm disorder is associated with significant morbidity, manifesting through symptoms that worsen the quality of life, as well as with adverse outcomes and increased mortality. The substantial AF burden on the healthcare system necessitates the development of effective and durable treatment strategies. While pharmacological management represents the first-line approach for AF, the limitations associated with this approach, including side effects and insufficient efficacy, have prompted the evolution of catheter ablation techniques that isolate pulmonary veins (PVs) and, thus, disrupt arrhythmia-causing impulses from the atria. Currently, three energy sources have gained U.S. Food and Drug Administration (FDA) and European regulatory approval (Conformité Européenne (CE) mark certification) for catheter ablation: radiofrequency ablation (RFA), cryoballoon ablation (CBA), and, more recently, pulsed-field ablation (PFA). RFA has subsequently become an effective treatment, demonstrating superior outcomes in randomized controlled trials compared to antiarrhythmic drug therapy. CBA has also proven to be a safe and effective alternative, particularly for patients with symptomatic paroxysmal AF, showing comparable efficacy to RFA and similar rates of complications. Meanwhile, PFA is emerging as a promising technique, offering non-inferior efficacy to conventional thermal methods while potentially minimizing the thermal damage to adjacent tissues associated with RFA and CBA. Despite higher equipment costs, the advantages of PFA in reducing complications highlight its potential role in AF management. However, considering the novelty of PFA, no data currently exist comparing this strategy with thermal techniques.
Therefore, further research is needed to improve the management of AF and patient outcomes to reduce healthcare burdens.
Obstructive sleep apnea (OSA) is highly prevalent in patients with acute coronary syndrome (ACS). The triglyceride glucose (TyG) index is considered closely linked to cardiovascular risk. However, the relationship between OSA, TyG index, and cardiovascular outcomes in ACS patients remains unclear. Hence, this study aimed to examine the effects of OSA and the TyG index on cardiovascular outcomes in ACS patients.
This post-hoc analysis included 1853 patients from the OSA–ACS project, a single-center prospective cohort study that enrolled ACS patients admitted between January 2015 and December 2019. OSA was defined as an apnea–hypopnea index of ≥15 events/hour. The primary endpoint was major adverse cardiovascular and cerebrovascular events (MACCE). Multivariable Cox regression models were used to evaluate the impact of OSA on cardiovascular events across the TyG index categories.
OSA was present in 52.5% of the participants, with a mean TyG index of 9.02 ± 0.68. Over a median follow-up of 35.1 (19.0–43.5) months, OSA was significantly associated with a heightened risk of MACCE (adjusted hazard ratio (aHR): 1.556; 95% confidence interval (CI): 1.040–2.326; p = 0.031) in the high TyG group within the fully adjusted model, along with elevated risk of hospitalization for unstable angina (aHR: 1.785; 95% CI: 1.072–2.971; p = 0.026). No significant associations were observed between OSA and MACCE in the low and moderate TyG groups.
This analysis demonstrates that OSA significantly increases the risk of adverse cardiovascular events in ACS patients with a high TyG index, underscoring the importance of routine OSA screening in these high-risk ACS patients to optimize cardiovascular risk stratification and personalize treatment strategies.
NCT03362385, https://clinicaltrials.gov/expert-search?term=NCT03362385.
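The TyG index analyzed above is conventionally defined as ln(fasting triglycerides [mg/dL] × fasting glucose [mg/dL] / 2); the sketch below assumes the study followed this standard formula, since the abstract does not state the computation explicitly:

```python
import math

def tyg_index(triglycerides_mg_dl, fasting_glucose_mg_dl):
    """Triglyceride-glucose (TyG) index.

    Standard definition: ln(fasting TG [mg/dL] x fasting glucose [mg/dL] / 2).
    Both inputs must be in mg/dL.
    """
    return math.log(triglycerides_mg_dl * fasting_glucose_mg_dl / 2)

# Illustrative values (hypothetical patient): TG 150 mg/dL, glucose 100 mg/dL
# yields a TyG index of about 8.92, near the cohort mean of 9.02 reported above.
```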
Heart rate variability (HRV) analysis is a noninvasive tool that allows cardiac autonomic control to be assessed. Numerous studies have reported HRV measurements, related changes, and clinical implications for heart failure patients. This review evaluates HRV characteristics in congestive heart failure (CHF), focusing on different recording durations and the diagnostic and prognostic values using HRV measurements. The recording durations are classified as (a) ultra short-term (substantially shorter than 5 minutes), (b) short-term (5 minutes), and (c) long-term (nominal 24 hours). This review of HRV diagnostic and prognostic significance in CHF focuses on time- and frequency-domain HRV measures that have previously been extensively studied. Reported studies document that HRV is lowered in CHF patients, whereas HRV increases may indicate disease improvement, e.g., in CHF patients undergoing cardiac resynchronization therapy. Reduced HRV has consistently been found to be associated with all-cause mortality in CHF patients. However, different thresholds of long-term HRV indices have been proposed as mortality predictors; meanwhile, findings related to the prediction of other cardiac events, including sudden cardiac death, remain inconsistent. HRV is reduced in CHF patients, but the use of HRV as a risk factor remains controversial, with no established cut-off values. HRV does not provide a clinically useful prediction of sudden cardiac death or other cardiac events in CHF patients. Thus, we advocate standardization of investigative protocols based on the existing time- and frequency-domain HRV indices rather than further developing more complex methods. Short-term recordings are preferable for clinical application and measurement reproducibility; thus, future investigations should focus on the following key questions:
1. How to design standardized short HRV tests suitable for outpatient settings?
2. Which HRV indices should be preferred, and what are their optimal prognostic thresholds?
3. How to standardize HRV assessment conditions to minimize external influences?
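The time-domain indices favored in this review, such as SDNN and RMSSD, can be computed directly from a sequence of normal-to-normal RR intervals; a minimal illustrative sketch (not tied to any specific device or software):

```python
import math

def sdnn(rr_intervals_ms):
    """SDNN: sample standard deviation of normal-to-normal RR intervals (ms)."""
    n = len(rr_intervals_ms)
    mean = sum(rr_intervals_ms) / n
    return math.sqrt(sum((x - mean) ** 2 for x in rr_intervals_ms) / (n - 1))

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
```

In practice, ectopic beats and artifacts must be removed or interpolated before these indices are computed, and the recording duration (ultra short-term, 5-minute, or 24-hour) must be reported alongside the values.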
The association between low-density lipoprotein cholesterol (LDL-C) levels and the risk of hemorrhagic stroke (HS) detected through different blood pressure statuses remains unclear. Hence, we systematically evaluated the association between LDL-C and HS in populations with and without hypertension.
We searched PubMed, Cochrane Library, and Embase databases for articles written in English. Only prospective design or randomized controlled trials (RCTs) reporting effect estimates with 95% confidence intervals (CIs) for the relationship between LDL-C and HS were included. We pooled risk ratios (RRs) stratified by blood pressure status and dose–response analyses with a two-stage generalized least squares for trend estimation (GLST) model. Finally, we compared the lower and optimal groups to find the effect of very low LDL-C levels on the risk of HS.
We included seven randomized controlled trials and nine prospective cohort studies involving 304,763 participants with 2125 (0.70%) HS events. The non-linear trend suggested that LDL-C levels of approximately 80 mg/dL among hypertensive patients and 115 mg/dL among non-hypertensive patients had the lowest risk of HS. Meanwhile, continually lowering LDL-C levels below the optimal level (80 mg/dL for hypertensive patients and 115 mg/dL for non-hypertensive patients) would increase the risk of HS in the hypertensive population (RR = 1.84, 95% CI: 1.36–2.50) but not in the non-hypertensive population (RR = 1.15, 95% CI: 0.97–1.36).
The risk of HS can be effectively reduced by controlling LDL-C levels to 60–80 mg/dL in the hypertensive population and 115 mg/dL in the non-hypertensive population. The safety range of controlling LDL-C levels to protect against HS among hypertensive patients is narrower than that among the non-hypertensive population. Additionally, controlling blood pressure might play a positive role in safeguarding against HS by lowering LDL-C levels.
Atrial fibrillation (AF), the most common sustained cardiac arrhythmia, poses significant challenges due to high morbidity, mortality, and healthcare costs. Pulmonary vein isolation (PVI) is a cornerstone treatment that disrupts arrhythmogenic pathways by electrically isolating the pulmonary veins. However, recurrence rates remain substantial, driven by complex demographic, biochemical, imaging, and electrocardiographic factors reflecting underlying pathophysiologies. Advancements in PVI techniques, including pulsed-field ablation and electroanatomic mapping, have improved procedural success. Antiarrhythmic drugs (AADs) enhance outcomes by stabilising atrial activity and reducing early recurrence, although the long-term benefits of these drugs are debated. Nonetheless, integrating these predictors into patient selection, procedural strategies, and post-ablation management enables personalised interventions. This review uniquely integrates demographic, biochemical, imaging, electrocardiographic, and procedural predictors into a multidimensional framework for comprehensive risk stratification of PVI outcomes. We critically evaluate emerging procedural techniques, notably pulsed-field ablation (PFA), emphasising the clinical applicability of these procedures. Key biochemical markers (e.g., N-terminal pro-brain natriuretic peptide (NT-pro-BNP), C-reactive protein (CRP), interleukin-6 (IL-6)) and imaging findings (e.g., left atrial fibrosis, epicardial fat) reflecting atrial pathophysiology are discussed in detail. Furthermore, readily accessible electrocardiographic parameters such as prolonged P wave duration and dispersion are emphasised as practical tools for patient risk assessment. This multidimensional approach holds promise for reducing AF recurrence and improving long-term outcomes in PVI, advancing patient-centered care in AF management.
Prosthetic heart valves are crucial for treating valvular heart disease and serve as substitutes for native valves. Bioprosthetic heart valves (BHVs) are currently the most common type used in clinical practice. However, despite the long history of use, challenges remain in clinical applications, most notably valve calcification, which significantly affects longevity and quality. The mechanisms through which calcification occurs are complex and not yet completely understood. Therefore, this paper aims to provide a comprehensive review of developments in prosthetic valves, focusing on the calcification processes in bioprosthetic heart valves and the biological, chemical, and mechanical factors involved. In addition, we highlight various anti-calcification strategies currently applied to BHVs and assess whether anti-calcification approaches can prolong valve durability and improve patient prognosis. Finally, we describe the imaging methods presently used to monitor calcification clinically. Advances in nanotechnology and tissue engineering may provide better options for mitigating prosthetic heart valve calcification in the future.
Atrial fibrillation (AF) is a common and serious arrhythmia that frequently complicates cardiac amyloidosis (CA), a rare condition characterized by amyloid deposits in the heart. The coexistence of AF in CA patients significantly increases the risk of heart failure, stroke, and other life-threatening complications; however, the therapeutic approach to managing AF in CA patients remains underexplored. Thus, this review discusses the features of AF in CA patients, recent research on the development of effective treatment options, and strategies for future therapies. A comprehensive review of the literature was conducted, assessing the epidemiology of AF in CA, the challenges in treatment, and the available intervention strategies, with a particular emphasis on catheter ablation and anticoagulation therapy. AF is highly prevalent in CA patients, with incidence rates reaching 88%. The presence of amyloid deposits exacerbates the risk of arrhythmias, leading to increased morbidity and mortality. Traditional risk stratification models, such as the Congestive Heart Failure, Hypertension, Age ≥75 [Doubled], Diabetes Mellitus, Prior Stroke or Transient Ischemic Attack [Doubled], Vascular Disease, Age 65–74, Female (CHA₂DS₂-VASc) score, have limited effectiveness in CA patients. Anticoagulation therapy, particularly direct oral anticoagulants, is recommended to prevent thromboembolic events, though individualized risk assessment is crucial. Catheter ablation has shown promise in improving outcomes, including reducing hospitalization rates and mortality. However, the benefits of catheter ablation remain controversial in light of recent studies suggesting potential risks such as prolonged hospital stays and higher economic burdens. AF is a significant and often fatal complication of CA.
The CHA2DS2-VASc score has limitations in assessing thrombotic risk in CA patients; meanwhile, speckle-tracking echocardiography (STE) has been shown to indirectly predict the risk of thrombosis in these patients, although its utility in CA requires further validation. While current therapies, including anticoagulation and catheter ablation, offer some benefits, their effectiveness remains uncertain due to the complexity of the pathophysiology of CA and the limited number of high-quality studies. Future research should focus on developing amyloid-targeted therapies and conducting randomized trials to optimize AF management in CA patients and thereby improve survival and quality of life.
Whether metabolic status is a risk factor for atrioventricular block (AVB), and whether it predicts the response to moderate-to-vigorous physical activity (MVPA), remains unclear.
A total of 82,365 UK Biobank participants without a history of AVB or pacemaker implantation who had participated in the accelerometer sub-study were included in the study population. Metabolic status was classified as healthy or unhealthy using modified International Diabetes Federation criteria for metabolic syndrome. We used multivariable Cox proportional hazards models to assess the associations between metabolic status and the primary outcome (a composite of second-degree or third-degree AVB) and the secondary outcomes (each component of the primary outcome and AVB-related pacemaker implantation). The relationship between weekly MVPA (min/week) and the primary outcome within each metabolic status category was assessed using restricted cubic splines.
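The restricted cubic spline approach used here to model the MVPA–outcome relationship can be illustrated with a minimal basis construction (Harrell's formulation). This is a generic sketch, not the study's code: the knot placement and the MVPA range shown are illustrative assumptions.

```python
import numpy as np

def rcs_basis(x, knots):
    """Restricted cubic spline basis (Harrell's form): a linear term plus
    k - 2 nonlinear terms, constrained to be linear beyond the boundary knots."""
    x = np.asarray(x, dtype=float)
    t = np.sort(np.asarray(knots, dtype=float))
    k = len(t)
    pos3 = lambda u: np.maximum(u, 0.0) ** 3  # truncated cubic (x - t)_+^3
    cols = [x]  # linear term
    for j in range(k - 2):
        # Cubic pieces cancel beyond the last knot, enforcing tail linearity
        term = (
            pos3(x - t[j])
            - pos3(x - t[k - 2]) * (t[k - 1] - t[j]) / (t[k - 1] - t[k - 2])
            + pos3(x - t[k - 1]) * (t[k - 2] - t[j]) / (t[k - 1] - t[k - 2])
        )
        cols.append(term)
    return np.column_stack(cols)

# Illustrative example: 4 knots over a hypothetical MVPA range (min/week)
X = rcs_basis(np.linspace(0, 1200, 7), knots=[100, 400, 700, 1000])
```

The resulting design matrix would then enter the Cox model in place of a single linear MVPA term, allowing the hazard to vary smoothly and non-linearly with MVPA.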
Of the 82,365 participants, the mean age was 62.3 years, and 44.1% were men. In total, 299 primary outcome events occurred during the 6.1-year follow-up. Compared to metabolically healthy participants, metabolically unhealthy participants had a 58% higher risk of the primary outcome (hazard ratio (HR): 1.58, 95% confidence interval (CI): 1.25–2.00; p < 0.001). This pattern was consistent for second-degree AVB (HR: 1.59, 95% CI: 1.12–2.27; p = 0.010), third-degree AVB (HR: 1.50, 95% CI: 1.12–2.03; p = 0.008), and AVB-related pacemaker implantation (HR: 2.25, 95% CI: 1.44–3.52; p < 0.001). Increased MVPA provided statistically significant protection against the primary outcome only in metabolically unhealthy participants, with a threshold of 830 min/week.
Overall, in this middle-aged population, metabolically unhealthy participants had a statistically significantly higher risk of second- or third-degree AVB and AVB-related pacemaker implantation than metabolically healthy participants. However, MVPA reduced the risk of second- or third-degree AVB in metabolically unhealthy participants, though the effect was attenuated with excessive MVPA. From this perspective, identifying metabolically unhealthy individuals and encouraging them to exercise is essential. Given the observational nature of this study, future research should verify the preventive effects of increased MVPA on conduction block in populations with metabolic abnormalities through randomized controlled trials. Moreover, the biological mechanisms underlying the protective effect of MVPA, and the safety of excessive MVPA, require further investigation.
Hypertension is a major risk factor for cardiovascular diseases (CVDs) and is closely related to metabolic abnormalities. The cardiometabolic index (CMI) integrates lipid profiles and anthropometric indicators, reflecting overall cardiometabolic health. However, the relationship between the CMI and blood pressure (BP) is poorly understood. Therefore, this study aimed to investigate the correlation between the CMI and clinical BP and to evaluate its potential as a cardiovascular risk indicator.
National Health and Nutrition Examination Survey (NHANES) data from 2015 to 2018 were used to calculate the CMI based on the triglycerides to high-density lipoprotein cholesterol ratio and the waist-to-height ratio. The relationship between CMI and systolic blood pressure (SBP)/diastolic blood pressure (DBP) was analyzed using multivariate regression, threshold effect analysis, and subgroup analysis.
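The CMI definition above reduces to simple arithmetic. A minimal sketch of the calculation (the function name and example values are illustrative, not from the study data; the only assumption is that triglycerides and HDL-C are expressed in the same units, as are waist circumference and height):

```python
def cardiometabolic_index(tg, hdl, waist, height):
    """CMI = (triglycerides / HDL-C) * (waist / height).

    tg, hdl      : triglycerides and HDL cholesterol in the same units (e.g., mg/dL)
    waist, height: waist circumference and height in the same units (e.g., cm)
    """
    return (tg / hdl) * (waist / height)

# Illustrative example: TG 150 mg/dL, HDL-C 50 mg/dL, waist 90 cm, height 170 cm
cmi = cardiometabolic_index(150, 50, 90, 170)
# (150/50) * (90/170) = 3 * 0.529... ≈ 1.59
```

Because both factors are ratios, the index is unit-free as long as each pair shares a unit, which is what makes it convenient for pooled survey data such as NHANES.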
In this study cohort of 4240 participants, CMI positively correlated with SBP and DBP. After adjusting for age, gender, and race, the partial correlation for SBP was 0.56 (95% CI: 0.19–0.93; p < 0.01), while for DBP, it was 1.15 (95% CI: 0.60–1.71; p < 0.001). The threshold effect analysis revealed a positive association with SBP when the CMI was below 6.83 (β = 1.44, 95% CI: 0.64–2.24; p < 0.001) and a negative association when the CMI was above 6.83 (β = –1.52, 95% CI: –2.77 to –0.28; p = 0.0123). For DBP, a positive correlation was found when the CMI was below 2.81 (β = 1.45, 95% CI: 0.10–2.79; p = 0.0345), and a negative correlation when the CMI was above 2.81 (β = –1.92, 95% CI: –3.08 to –0.77; p = 0.0012). A strong interaction was observed between the CMI and gender for SBP (p = 0.0054), and a trend toward an interaction between the CMI and age for DBP (p = 0.1667).
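The threshold effect reported above corresponds to a piecewise (segmented) linear model with a single change point. A hedged numpy sketch of fitting such a model at a fixed, known threshold follows; the data are synthetic, the slopes are chosen only to mimic the sign change around 6.83, and the change-point search and covariate adjustment used in the actual NHANES analysis are omitted.

```python
import numpy as np

def fit_piecewise(x, y, threshold):
    """Least-squares fit of y = a + b1*x + b2*max(x - threshold, 0),
    so the slope is b1 below the threshold and b1 + b2 above it."""
    X = np.column_stack([np.ones_like(x), x, np.maximum(x - threshold, 0.0)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    a, b1, b2 = coef
    return a, b1, b1 + b2  # intercept, slope below, slope above

# Synthetic illustration: slope +1.4 below x = 6.83, -1.5 above
rng = np.random.default_rng(0)
x = rng.uniform(0, 12, 500)
y = 100 + 1.4 * x - 2.9 * np.maximum(x - 6.83, 0.0) + rng.normal(0, 0.5, 500)
a, slope_below, slope_above = fit_piecewise(x, y, 6.83)
```

Expressing the kink as an added hinge term `max(x - threshold, 0)` keeps the fitted curve continuous at the threshold while letting the two segments have independent slopes.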
This study found a significant positive correlation between the CMI and BP, with threshold effects supporting a non-linear relationship. The strong interaction between the CMI and gender for SBP suggests that the influence of the CMI on BP may be gender-dependent. These results highlight the importance of utilizing CMI in personalized cardiovascular risk stratification and underscore the relevance of considering patient factors such as gender in managing hypertension.
Recent advancements in computed tomography have significantly transformed the clinical application of this technique in diagnosing and managing coronary artery disease (CAD). Computed tomography coronary angiography (CTCA) has emerged as a leading non-invasive imaging modality, often serving as the first-line investigation to exclude obstructive CAD in patients with chronic coronary syndrome. Beyond its utility in diagnosing CAD, CTCA has become instrumental in procedural planning for percutaneous coronary intervention (PCI), particularly in complex cases such as left main stem (LMS) interventions, where peri-procedural risks are elevated. This review highlights the evolving role of CTCA in LMS PCI, underscoring its clinical utility in improving procedural precision and, subsequently, patient outcomes. Recent technological advancements, including detailed multiplanar and three-dimensional (3D) reconstructions, CT-derived fractional flow reserve (CTFFR), and the integration of artificial intelligence (AI) algorithms, have expanded the capabilities of CTCA. These innovations allow for comprehensive anatomical and functional assessments, enabling precise evaluation of plaque morphology, lesion complexity, and bifurcation anatomy, alongside PCI simulations. By offering detailed insights into coronary vasculature and lesion characteristics, CTCA provides critical information for optimising LMS PCI strategies. This review explores the current applications and future potential of CTCA in guiding LMS PCI, highlighting its role in improving procedural planning, risk assessment, and overall management of this challenging patient population.
Cardiovascular diseases are a leading cause of mortality worldwide. Physical activity is linked with a reduced prevalence of cardiovascular diseases. However, excessive training volume may paradoxically increase the risk of cardiovascular diseases. Prediction models are usually derived to facilitate decision-making and may be used to precisely adjust the intensity of physical activity and stratify individual exercise capacity. Incorporating prediction models and knowledge of the risk factors for cardiovascular diseases allows for the accurate identification of risk groups among athletes. Given the growing popularity of amateur physical activity, as well as the high demands placed on professional athletes, safeguarding their health and providing precise pre-participation recommendations, return-to-play guidelines, and training intensities is a significant challenge for physicians and fitness practitioners. Athletes with confirmed or suspected cardiovascular disease should be guided to train within carefully adjusted safe zones. Indirect prediction algorithms are feasible and easy-to-apply methods for estimating individual cardiovascular disease risk. Current knowledge about the use of clinical forecasting scores in athletic cohorts is limited, and numerous controversies remain. The purpose of this review is to summarize the practical applications of the most common prediction models for maximal oxygen uptake, cardiac arrhythmias, hypertension, atherosclerosis, and cardiomyopathies among athletes. We primarily focused on endurance disciplines, with additional insight into strength training. The secondary aim was to discuss their relationships in the context of the clinical management of athletes and to highlight key understudied areas for future research.
Electrocardiogram (ECG) screening in athletes is essential due to the unique cardiac adaptations induced by intensive training. However, differentiating between physiological adaptations and pathological abnormalities remains a significant challenge, particularly when considering variations across different sports, ages, and genders.
A systematic review of observational studies published between 2015 and 2025 was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Data were extracted from 20 studies examining ECG changes in athletes across endurance, strength, and mixed sports, encompassing both adolescent and adult populations.
Commonly observed ECG changes included increased QRS amplitude, T-wave inversions, and sinus bradycardia, particularly in endurance athletes, while strength-based athletes frequently exhibited left ventricular hypertrophy. Male athletes showed higher QRS voltages, longer QRS durations, and longer PR intervals, whereas female athletes demonstrated elevated resting heart rates and prolonged corrected QT (QTc) intervals. Adolescents who engaged in regular sports displayed fewer abnormal ECG findings than adults; however, high-intensity training in adolescent athletes was associated with right atrial enlargement and increased P-wave duration. Detraining effectively reversed certain ECG changes, including prolonged QT intervals and T-wave abnormalities, though these changes often reappeared upon resumption of intense training. Notably, de novo ECG abnormalities, such as T-wave inversions and ST-segment depression, were identified in athletes after COVID-19 infection. This review also highlights the financial burden of widespread ECG screening, but reinforces the importance of ECG screening in preventing sudden cardiac death (SCD) through comprehensive cardiac evaluations.
This review emphasizes the complexity of ECG interpretation in athletes, highlighting sport-specific, gender-based, and age-related variations. The persistent high false-positive rates underscore the need for refined, sport-specific ECG guidelines. Recent recognition of sports medicine as a primary specialty within the European Union (EU) reinforces the importance of comprehensive physician training. Integrating artificial intelligence (AI) technology into ECG screening can enhance diagnostic accuracy, reduce costs, and facilitate large-scale implementation. Meanwhile, collaborative efforts among clinicians, researchers, and policymakers are essential to developing cost-effective and standardized ECG screening protocols, ensuring improved athlete care, and advancing the field of sports cardiology.
Heart failure (HF) is a complex clinical syndrome resulting from impaired myocardial function or structure, affecting approximately 56 million patients worldwide. Cardiometabolic risk factors, including hypertension, insulin resistance, obesity, and dyslipidemia, play a pivotal role in both the pathogenesis and progression of HF. These risk factors frequently coexist as part of cardiometabolic syndrome and contribute to widespread organ and vascular dysfunction, leading to conditions such as coronary artery disease, chronic kidney disease, type 2 diabetes mellitus, non-alcoholic fatty liver disease, and stroke. Emerging evidence suggests that these conditions not only increase the risk of developing HF, but also negatively impact its progression and outcome. As the global burden of cardiometabolic disease continues to rise, a growing number of HF patients will exhibit multiple metabolic comorbidities. Understanding the intricate relationship between cardiometabolic risk factors and diseases and their impact on HF outcomes is therefore crucial for identifying novel therapeutic avenues. A more integrated approach to HF prevention and management—one that considers these interconnected cardiometabolic factors—offers significant potential for improving patient outcomes.
The tricuspid valve (TV) is a complex three-dimensional (3D) anatomical structure; however, current guidelines recommend tricuspid annulus (TA) measurements to be performed with two-dimensional (2D) echocardiography. The aim of this study was to compare TV measurements obtained with 2D and four-dimensional (4D) echocardiography for surgical planning.
All echocardiographic data of patients referred to our center for TV assessment were collected. Multimodality imaging data were reviewed, including 2D transthoracic echocardiography (TTE) integrated with information from 3D TTE. Measurements were also compared with those obtained using the 4D Auto Tricuspid Valve Quantification (TVQ) tool.
Overall, 11 patients (median age 72 [66–78] years, 18% female) were included in the study. Mild, moderate, and severe tricuspid regurgitation (TR) was present in 6, 3, and 2 patients, respectively. Systolic pulmonary artery pressure was 35 ± 8 mmHg, inferior vena cava diameter 21 ± 4 mm, right atrial area 25 ± 9 cm2, 4D ejection fraction 45 ± 7%, 4D fractional area change 40 ± 6%, and tricuspid annular plane systolic excursion 21 [15–25] mm. The 2D and 4D right ventricular basal diameters (RVD1) were significantly different (p < 0.005), as were the 2D and 4D right ventricular diameters measured at the level of the left ventricular papillary muscles (RVD2) (p < 0.012) and the 2D and 4D tricuspid annular diameters (p = 0.020). Despite these differences, a strong correlation between variables was observed (Spearman correlation coefficient >0.824). In evaluating the correlation between TR severity and the analyzed variables, RVD1 was related to TR severity on both 2D and 4D echocardiography. Conversely, RVD2 and TA diameter were significantly associated with TR severity only on 4D echocardiography.
Our results suggest that specific patient subsets could benefit more from TA measurements using the 4D Auto TVQ tool to help identify the mechanisms responsible for TR, including candidates for left-sided valve surgery and patients in whom the indication for TV repair is unclear.