Antimicrobial resistance remains a global challenge that is not under control. Countries around the world, including the UK, have developed action plans to counteract this silent pandemic. In the UK, such action plans include antimicrobial stewardship, with strategies promoting an early intravenous (IV)-to-oral antibiotic switch. However, it is evident that despite all this guidance, various barriers or myths still prevent the switch. A literature search of studies and reviews from the past 25 years on antimicrobial intravenous-to-oral switch (IVOS) was conducted. The literature was reviewed and presented thematically to address perceived myths regarding IVOS. Several studies show that there are various reasons why early IVOS is restricted. Some of these beliefs, or myths, are shared by patients and clinicians alike, such as the view that IV antimicrobials are superior to oral options. Some of these barrier beliefs stem from gaps in knowledge about antibiotic pharmacology and microbiology, leading to unnecessary IV therapy for resistant organisms. Excessive reliance on inflammatory markers alone to gauge the severity of an infection is another barrier. Other common myths amongst clinicians are that IV antimicrobials are safer for patients, have no environmental impact, and place minimal burden on the clinical team and healthcare organisation. The fear of litigation from patients for switching early, as well as hierarchical decision-making, are further limitations. Although IVOS is not appropriate for every patient, there is clearly a lack of awareness of the existing guidance and of the risks of not switching when appropriate. All of this is reflected in the beliefs and myths shared by prescribing clinicians, and more needs to be done to change these views.
Meibomian gland dysfunction (MGD)-related dry eye results in reduced tear production, an unstable tear film, and ocular surface damage, and may even cause keratitis. Demodex infection causes inflammatory diseases of the ocular surface. This study aimed to explore the impact of Demodex infection on the occurrence and prognosis of keratitis in patients with MGD-related dry eye.
A total of 122 MGD patients who visited the Department of Ophthalmology, The First People’s Hospital of Chun’an County from June 2022 to June 2023 were selected for this retrospective study. Patients were divided into a keratitis group (n = 65) and a non-keratitis group (n = 57) according to the presence of keratitis. MGD patients with keratitis were followed up for one year and divided into a good prognosis group (n = 36) and a poor prognosis group (n = 29) based on their prognosis. Demodex infection status and ocular surface parameters were assessed in these two groups of patients. Logistic regression was adopted to analyze factors influencing keratitis in patients with MGD.
The positive rate of Demodex infection in the keratitis group was 81.5%, significantly higher than that in the non-keratitis group (35.1%, p < 0.05). Compared with the non-keratitis group, the keratitis group had a reduced tear film break-up time (BUT) and increased corneal fluorescein staining (CFS), meibomian gland (MG) dropout, and plugging of MG orifices (p < 0.05). Logistic multivariate regression analysis showed that Demodex infection (odds ratio [OR]: 6.209, 95% confidence interval [CI]: 2.101–18.348), CFS (OR: 2.627, 95% CI: 1.562–4.416), and plugging of MG orifices (OR: 3.174, 95% CI: 1.616–6.235) were independent risk factors for keratitis in patients with MGD-related dry eye (p < 0.05), while BUT (OR: 0.768, 95% CI: 0.606–0.972) was a protective factor (p < 0.05). Age and MG expression in the good prognosis group were lower than those in the poor prognosis group (p < 0.05). The rate of Demodex infection did not differ significantly between the good prognosis and poor prognosis groups (p > 0.05).
The positive rate of Demodex infection is higher in patients with MGD complicated by keratitis. Demodex infection, CFS, and plugging of MG orifices are independent risk factors for keratitis in MGD-related dry eye patients, while tear film BUT is a protective factor. Demodex infection does not affect the prognosis of keratitis.
The dawn phenomenon (DP), characterized by spontaneous morning hyperglycemia in type 2 diabetes mellitus (T2DM), may exacerbate post-breakfast glucose excursions. This study investigated the association between DP and postprandial hyperglycemia, referred to as the “extended dawn phenomenon”.
In this cross-sectional study, 500 T2DM patients (glycated hemoglobin A1c (HbA1c) <7.5%) were recruited from Huadong Hospital, Fudan University between 2021 and 2023. A total of 40 participants were excluded due to incomplete data, resulting in 460 patients for final analysis. All participants underwent continuous glucose monitoring (CGM) and were stratified by the magnitude of DP (δDawn, defined as the difference between fasting glucose and nocturnal nadir glucose, with a threshold of ≥1.11 mmol/L). They were then matched by fasting glucose. Glycemic profiles, including peak post-breakfast glucose and time-in-range (TIR; percentage of time within the target glucose range of 3.9–10.0 mmol/L), were compared between groups. Multivariable logistic regression was used to identify determinants of post-breakfast hyperglycemia.
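The two CGM-derived metrics defined above, δDawn and TIR, reduce to simple arithmetic. The following snippet is an illustrative sketch only, not the study's analysis code; the function names and the example glucose values are hypothetical.

```python
def delta_dawn(nocturnal_glucose, fasting_glucose):
    """delta-Dawn: fasting glucose minus the nocturnal nadir (mmol/L)."""
    return fasting_glucose - min(nocturnal_glucose)

def time_in_range(glucose_values, low=3.9, high=10.0):
    """TIR: percentage of readings within the 3.9-10.0 mmol/L target range."""
    in_range = sum(low <= g <= high for g in glucose_values)
    return 100.0 * in_range / len(glucose_values)

# Hypothetical overnight trace (mmol/L) and fasting value
nocturnal = [5.8, 5.2, 4.9, 5.1, 5.6]
fasting = 6.3
print(round(delta_dawn(nocturnal, fasting), 2))  # 1.4 -> above the 1.11 mmol/L threshold, so DP present
print(time_in_range([4.5, 6.0, 9.7, 10.5]))      # 75.0
```

In practice a full CGM trace would be windowed into nocturnal and postprandial segments before applying these calculations; the sketch assumes that segmentation has already been done.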
Despite comparable fasting glucose levels, patients with DP exhibited a higher peak post-breakfast glucose (median [interquartile range, IQR]: 9.7 [8.2–10.7] vs 8.9 [8.0–10.0] mmol/L, p = 0.02) and a reduced TIR in the overall cohort (94.1% [85.8–100.0] vs 100.0% [92.3–100.0], p < 0.001), although this difference attenuated after matching (p = 0.133). δDawn independently predicted post-breakfast hyperglycemia (odds ratio (OR) = 1.591, 95% confidence interval (CI): 1.283–1.993, p < 0.001), along with HbA1c (OR = 2.322, 95% CI: 1.530–3.566, p < 0.001), homeostatic model assessment for insulin resistance (HOMA-IR) (OR = 1.308, 95% CI: 1.110–1.548, p = 0.001), and homeostasis model assessment of β-cell function (HOMA-β) (OR = 0.990, 95% CI: 0.983–0.997, p = 0.004).
DP contributes to prolonged postprandial hyperglycemia, underscoring its role as a potential therapeutic target for optimizing glycemic control in T2DM.
Lumbar spine disease frequently occurs in middle-aged and elderly individuals, significantly affecting their physical and mental health. Posterior lumbar interbody fusion (PLIF) is widely used for treating these diseases; however, postoperative complications may lead to fatigue. We therefore explored the effect of different preoperative doses of esketamine on postoperative fatigue syndrome (POFS) in patients undergoing PLIF.
This retrospective study analyzed the clinical data of 105 patients undergoing PLIF between June 2022 and September 2024. Patients were divided into a high-dose esketamine group (Group HS, 30 cases), a low-dose esketamine group (Group LS, 35 cases), and a control group (Group C, 40 cases) based on the preoperative dose of esketamine. Intraoperative sufentanil usage, heart rate (HR), and mean arterial pressure (MAP) were assessed at defined time points during surgery (T0–T3). Recovery conditions and the length of hospitalization were compared across the three groups. Fatigue levels and psychological status, evaluated using the Christensen fatigue scale, self-rating depression scale (SDS), and self-rating anxiety scale (SAS), were assessed preoperatively and on postoperative days 3 and 5. Pain intensity was assessed using the numeric rating scale (NRS) at 30 min after awakening and on postoperative days 1 and 2. The incidence of POFS and adverse drug reactions (ADRs) was also analyzed across the three groups.
Compared with T0, MAP and HR decreased significantly across all groups from T1 to T3 (p < 0.05). From T1 to T3, MAP and HR were significantly lower in Group HS than in the other groups, while Group LS demonstrated lower levels than Group C (p < 0.05). Intraoperative sufentanil usage was also substantially lower in Group HS than in the other groups, with Group LS using less than Group C (p < 0.05). Upon leaving the recovery room, Group HS had a significantly higher recovery score than the other groups, with Group LS scoring higher than Group C (p < 0.05). On postoperative days 3 and 5, Group HS exhibited lower SDS and SAS scores than the other groups (p < 0.05), while Group LS exhibited lower scores than Group C (p < 0.05). Furthermore, fatigue scores were considerably lower in Group HS than in the other groups (p < 0.05), whereas the difference between Group C and Group LS did not reach statistical significance (p > 0.05). The incidence of POFS in Group HS was 46.67%, significantly lower than in Group LS (88.57%) and Group C (92.50%; p < 0.05). On postoperative days 1 and 2, Group HS had the lowest NRS scores, followed by Group LS, with Group C exhibiting the highest scores (p < 0.05). Additionally, there were no significant differences among the three groups in awakening time, extubation time, length of hospitalization, or incidence of ADRs (p > 0.05).
Preoperative intravenous esketamine administration effectively prevents POFS in middle-aged and elderly patients undergoing PLIF under general anesthesia in the prone position. An esketamine dose of 0.5 mg/kg shows superior efficacy in reducing postoperative anxiety, depression, and pain levels.
Patients with advanced lung cancer experience substantial symptom burden and emotional distress, highlighting the need for improved palliative models. This retrospective study aimed to analyze the impact of a multidisciplinary collaborative palliative care model on the quality of life and medical costs in patients with advanced lung cancer.
This retrospective study enrolled 280 patients with advanced lung cancer who received palliative care at Wenzhou Central Hospital. Patients were divided into two groups based on the care they received: a group receiving routine palliative care (routine care group) and another group receiving multidisciplinary collaborative palliative care (multidisciplinary care group). Changes in depression, anxiety, quality of life scores [Functional Assessment of Cancer Therapy-Lung (FACT-L), Functional Assessment of Cancer Therapy-Lung Cancer Subscale (FACT-LCS), Functional Assessment of Cancer Therapy-Lung total outcome index (FACT-L TOI)], sleep quality score [Pittsburgh Sleep Quality Index (PSQI)], overall well-being score [General Well-Being (GWB) scale], and medical costs (daily and total costs) were compared between the two groups before and after the intervention.
After the intervention, the multidisciplinary care group had significantly lower symptoms of depression and anxiety compared to the routine care group (p < 0.05); significantly higher scores in all dimensions of FACT-L, FACT-LCS, and FACT-L TOI (p < 0.05); and a significantly lower PSQI score and a considerably higher GWB score (p < 0.05). Furthermore, the average daily costs and total medical expenses were lower in the multidisciplinary care group than in the routine care group (p < 0.05).
A multidisciplinary collaborative palliative care model can significantly reduce negative emotions, enhance quality of life and overall well-being, improve sleep quality, and effectively reduce medical expenses in patients with advanced lung cancer, demonstrating high clinical application value.
Penile cancer (PeCa) is a rare but preventable malignancy that predominantly affects elderly men. Incidence is rising in high-income countries, and mortality is particularly high in those aged 75 years and older. The major, synergistic risk factors are male genital lichen sclerosus (MGLSc) and persistent infection with high-risk human papillomavirus (HPV). This narrative review examines the peer-reviewed evidence on PeCa’s epidemiology, aetiopathogenesis, diagnosis, management, and prevention, with a focus on MGLSc and high-risk HPV, and their implications for disease in older men. Priority was given to high-impact studies and recent advances relevant to clinical practice. PeCa develops via two principal pathways: an HPV-dependent route, typically leading to undifferentiated penile intraepithelial neoplasia (uPeIN) and related squamous cell carcinoma (SCC) subtypes; and an HPV-independent route, driven by chronic inflammation and scarring from MGLSc, leading to differentiated PeIN (dPeIN) and SCC. Additional modifiable risk factors include phimosis, smoking, and poor genital hygiene. Diagnosis relies on careful clinical examination, dermatoscopy, and histopathology. Management ranges from topical therapy and circumcision to organ-sparing surgery, lymphadenectomy, systemic chemotherapy, and emerging immunotherapies. Prognosis is closely related to lymph node involvement. Preventive strategies, particularly early diagnosis and treatment of MGLSc and PeIN, HPV vaccination, circumcision, and smoking cessation, could substantially reduce disease burden. PeCa remains an under-recognised malignancy in older men despite being largely preventable. Improved public awareness, timely diagnosis of precursor conditions, and broader uptake of preventive interventions are essential to reverse current trends in incidence and mortality.
Gamification has emerged as an innovative pedagogical strategy with considerable potential to enhance competency development in health professions education. However, its systematic integration into pharmacy clinical skills training remains underexplored. This review systematically synthesises current evidence on the design principles, theoretical foundations, and educational outcomes of gamified learning in pharmacy curricula. Particular emphasis was placed on its influence on learner motivation, knowledge retention, clinical reasoning, and interprofessional collaboration. A comprehensive literature search encompassing pharmacy, medical, and health sciences education was conducted to identify core components and best practice frameworks for effective gamification. Evidence indicates that well-designed gamification models can significantly improve learner engagement and promote the meaningful integration of theoretical knowledge with authentic clinical practice. Nonetheless, several barriers, including limited instructional resources, inadequate faculty expertise, underdeveloped evaluation tools, and ethical concerns related to data security and digital equity, continue to constrain widespread implementation. This review proposes actionable strategies such as precise alignment of gamification elements with course objectives, structured faculty capacity building, enhancement of technological infrastructure, adoption of multidimensional evaluation systems, and strengthened interprofessional collaboration. Future research should focus on adaptive, artificial intelligence (AI)-driven gamification approaches, evaluate their long-term impacts on clinical competence, and develop standardised, evidence-based guidelines for sustainable integration, as outlined in the proposed frameworks within this review.
Anticoagulation using vitamin K antagonists or direct oral anticoagulants is an established treatment option for stroke prevention in patients with atrial fibrillation (AF). Although AF is common in patients with end-stage kidney disease (ESKD), the role of anticoagulation in stroke prevention has not been established in this group of patients. Major clinical trials have excluded patients with advanced kidney disease, which explains the significant lack of evidence-based guidelines to aid clinical decisions in the management of AF in patients with ESKD. Results from smaller studies and meta-analyses involving ESKD patients have not shown any significant advantage of anticoagulants in preventing thromboembolic events. Moreover, anticoagulation has been associated with a higher risk of significant bleeding in dialysis patients. Therefore, caution and individualised treatment plans are suggested when considering anticoagulation in patients with ESKD. Ongoing clinical trials might illuminate this situation and help formulate more definitive guidance for anticoagulation use in ESKD patients. In summary, insufficient and inconclusive data, the resulting lack of evidence-based guidelines, and the unique hemostatic paradox intrinsic to ESKD muddle the management of AF in ESKD. This underscores the need to consolidate and synthesize data from past and ongoing studies to compare existing treatment options and identify gaps in knowledge that can direct further studies.
Cerebrospinal fluid (CSF) is a crucial type of biological specimen for neurosyphilis diagnosis. This study aims to comprehensively analyse the clinical manifestations and risk factors of adverse reactions in syphilis patients following lumbar puncture (LP).
A total of 573 syphilis patients who underwent LP examinations between August 2023 and April 2024 at the Department of Sexually Transmitted Diseases, Shanghai Skin Disease Hospital, were selected. The clinical manifestations, onset timing, risk factors, and duration of relief from adverse reactions in these patients undergoing LP were statistically analysed.
Among the 573 patients diagnosed with syphilis, 142 (24.8%) experienced adverse reactions following LP. Factors such as age, needle size, symptoms of paralysis, and the duration of postoperative bed rest were significantly associated with adverse reactions after LP (p < 0.05). Furthermore, age >30 years, use of a 20G needle, and paralysis symptoms were identified as independent protective factors (p < 0.05), whereas a bed rest duration >6 hours was associated with an increased risk of adverse reactions (odds ratio [OR] = 3.544, p = 0.010). The average onset time of adverse reactions following LP was 14.34 ± 1.54 hours, while the average time to resolution of these reactions was 5.52 ± 0.26 days postoperatively.
The high incidence of adverse reactions following LP in patients with syphilis poses challenges to restoring quality of life among affected patients. A thorough assessment of risk factors for adverse reactions could inform the selection of appropriate measures to mitigate the occurrence of such reactions.
Patients with acute ST-segment elevation myocardial infarction (STEMI) are at high risk of major adverse cardiovascular events (MACE) even after discharge, underscoring the need to identify reliable biomarkers. Therefore, this study aimed to investigate the predictive performance of serum monomeric C-reactive protein (mCRP) for MACE in patients with acute STEMI at one-year post-discharge.
A retrospective analysis was conducted on 242 patients with acute STEMI who underwent emergency Percutaneous Coronary Intervention (PCI) in Liyang City People’s Hospital between January 2021 and December 2023. Patients were divided into the MACE group (n = 58) and the non-MACE group (n = 184) based on major adverse cardiovascular events. Univariate and binary logistic regression analyses were performed to identify factors influencing MACE events in patients with acute STEMI treated with emergency PCI. Furthermore, predictive performance was assessed using receiver operating characteristic curve (ROC) analysis.
There were no statistically significant differences (p > 0.05) between the groups in terms of age, gender, body mass index (BMI), smoking history, alcohol consumption history, history of hypertension, left ventricular end-diastolic diameter (LVEDD), total cholesterol (TC), triglycerides (TG), B-type natriuretic peptide (BNP), low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), and left ventricular end-diastolic volume (LVEDV). Comparisons of mCRP (within 24 hours of admission), left ventricular ejection fraction (LVEF), and cardiac troponin I (cTnI) showed statistically significant differences (p < 0.05). Monomeric CRP demonstrated a positive correlation with cTnI levels (r = 0.196, p < 0.05). Binary logistic regression analysis identified mCRP, LVEF, and cTnI levels as independent predictors of one-year post-discharge MACE in patients with acute STEMI (p < 0.05). ROC analysis yielded an area under the curve of 0.840 for mCRP (standard error = 0.028; 95% CI: 0.785–0.896; p < 0.001). At the optimal cutoff (Youden index = 0.50), mCRP demonstrated a sensitivity of 77.59% and a specificity of 72.83%.
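The Youden index reported above follows directly from the sensitivity and specificity; the snippet below is a quick illustrative check, not part of the study's analysis.

```python
# Youden index J = sensitivity + specificity - 1; J is maximised over ROC
# thresholds to pick the optimal cutoff.
def youden_index(sensitivity, specificity):
    return sensitivity + specificity - 1

# Values reported for mCRP above: sensitivity 77.59%, specificity 72.83%
j = youden_index(0.7759, 0.7283)
print(round(j, 2))  # 0.5 -> consistent with the reported Youden index of 0.50
```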
Monomeric CRP exhibits strong predictive performance for MACE in patients with acute STEMI at one year post-discharge.
This study aimed to evaluate the predictive value of the hemoglobin, albumin, lymphocyte, and platelet (HALP) score for pulmonary infections in patients with non-small cell lung cancer (NSCLC) undergoing chemotherapy.
A total of 180 NSCLC patients admitted to Anhui Public Health Clinical Center between January 2021 and December 2023 were enrolled. Patients were divided into an infection group (n = 65) and a non-infection group (n = 115) based on the occurrence of pulmonary infection during chemotherapy. Univariate and multivariate binary logistic regression analyses were performed to identify factors associated with pulmonary infections in NSCLC patients. Pearson correlation analysis was used to examine relationships between variables, and the predictive value of the HALP score was assessed using receiver operating characteristic (ROC) curve analysis.
There were no statistically significant differences between the two groups in age, gender, body mass index (BMI), hypertension, NSCLC subtype, serum potassium and calcium levels, pathological stage, total protein, or white blood cell count (WBC) (p > 0.05). However, significant differences were observed for the presence of diabetes, hemoglobin (Hb), albumin (ALB), platelet count (PLT), lymphocyte count (LYM), and HALP score (p < 0.05). Multivariate logistic regression analysis identified diabetes and HALP as independent predictors of pulmonary infections in NSCLC patients during chemotherapy (p < 0.05). ROC analysis showed that the area under the curve (AUC) for HALP was 0.812 (standard error = 0.031; 95% confidence interval (CI): 0.752–0.873; p < 0.05), with a Youden index of 0.55. At the optimal cutoff value, the sensitivity and specificity were 89.23% and 66.09%, respectively. Patients with HALP scores <17.89 had a significantly higher incidence of pulmonary infection compared to those with HALP scores ≥17.89 (p < 0.05).
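The HALP score used above is conventionally computed from the four component measurements; the sketch below shows that calculation under assumed units (hemoglobin and albumin in g/L, lymphocyte and platelet counts in 10⁹/L) with hypothetical example values, not patient data from this study.

```python
# HALP = hemoglobin x albumin x lymphocyte count / platelet count
def halp_score(hb, alb, lym, plt):
    return hb * alb * lym / plt

# Hypothetical patient: Hb 120 g/L, albumin 38 g/L, lymphocytes 1.2x10^9/L,
# platelets 310x10^9/L
score = halp_score(hb=120, alb=38, lym=1.2, plt=310)
print(round(score, 2))  # 17.65 -> below the 17.89 cutoff reported above
```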
The HALP score is a valuable predictor of pulmonary infections in NSCLC patients undergoing chemotherapy and may aid in early risk stratification.
Accurate and standardized diagnostic coding is essential for hospital performance, reimbursement, and the secondary use of health data for management and research. Despite its significance, how diagnostic coding structure and complexity shift over time, and how these trends differ between institutions, remain poorly defined at the health system level. This study aimed to assess long-term trends and inter-hospital variability in diagnostic coding practice in Chinese public hospitals, focusing on the secondary diagnosis number (SDN), International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) chapter composition, and diagnostic diversity over a 12-year period.
This retrospective study analyzed inpatient discharge data from two public tertiary hospitals (Hospital A and Hospital B) at three time points: 2012, 2019, and 2024. Key indicators included the mean SDN per admission, ICD-10 chapter-level diagnosis composition, the Shannon diversity index, and heatmap-based profiling of coding structure. Non-parametric tests were applied to assess temporal trends and differences between hospitals.
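The Shannon diversity index named among the key indicators has a standard definition over the proportions of each ICD-10 chapter; the snippet below is an assumed implementation for illustration, not the study's code, and the chapter counts are hypothetical.

```python
import math

# Shannon diversity index H = -sum(p_i * ln(p_i)), where p_i is the
# proportion of admissions coded to chapter i; higher H = more diverse coding.
def shannon_index(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# An even spread over 4 chapters gives the maximum H = ln(4) ~= 1.386;
# concentration in one chapter drives H down.
print(round(shannon_index([100, 100, 100, 100]), 3))  # 1.386
print(round(shannon_index([370, 10, 10, 10]), 3))     # 0.349
```

Under this definition, the modest downward trend in the index reported below corresponds to coding becoming more concentrated in fewer chapters.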
Mean SDN increased significantly in both hospitals between 2012 and 2024 (Hospital A: 1.45 vs. 1.62; Hospital B: 1.37 vs. 1.61; p < 0.01), suggesting progressively deeper documentation of comorbidities and complications. ICD-10 chapter distributions also changed over time, with a gradual increase in chronic disease-related diagnoses and a substantial reduction in non-specific categories, such as symptoms and abnormal findings (p < 0.001). The Shannon diversity index showed a modest downward trend, reflecting increasing concentration and standardization in coding patterns. Heatmap analysis further revealed a decrease in inter-hospital variation and a clear convergence toward standardized diagnostic structures over time.
Over the past decade, diagnostic coding practices in Chinese public hospitals have become more comprehensive, more structured, and increasingly standardized. These trends are consistent with the combined effects of policy reforms, institutional learning, and the digitalization of health information systems. Furthermore, the observed improvements in coding depth and structure are likely to contribute to greater accuracy and reliability of Diagnosis-Related Group (DRG)-based payment systems.
An ageing population in the UK has seen an increased prevalence of chronic kidney disease (CKD), with a higher proportion of old, frail people requiring kidney replacement therapy. This article aims to explore why peritoneal dialysis (PD) may be the optimal dialysis modality for older people, considering factors such as frailty and cognitive impairment. PD offers considerable benefits over haemodialysis, including improved quality of life, benefits to cognition and higher treatment satisfaction. Assisted PD supports the maintenance of treatment for those with physical or cognitive impairments and helps to alleviate caregiver burden. Despite this, challenges such as treatment burden and peritonitis should be thoroughly considered. Overall, PD aligns well with the priorities of older people, emphasising quality of life and autonomy over survival. However, there needs to be improved funding and access to assisted PD programmes going forward to ensure PD is a viable treatment option.
Autoimmune pancreatitis (AIP) is a relatively rare chronic fibroinflammatory disorder of the pancreas caused by autoimmune mechanisms. Patients with this condition generally show a clear response to glucocorticoid therapy. Notably, its clinical and imaging features often resemble those of pancreatic cancer (PC), particularly when AIP presents as a focal mass. In such cases, clinicians may confuse the two diseases. Given that AIP and PC differ considerably with respect to their biological behavior and treatment, diagnostic errors can lead to unnecessary surgery or delayed treatment. Consequently, it is essential to accurately distinguish between these conditions. Recent diagnostic advances, including the application of liquid biopsy and artificial intelligence, are now being evaluated as alternative approaches to conventional diagnostic methods, and may contribute to improving the distinction between AIP and PC. In this review, we summarize the current evidence, outline the clinical profile of AIP, and compare AIP with PC with respect to epidemiological, clinical, serological, imaging, and histopathological dimensions. In addition, we discuss the advantages and limitations of these new diagnostic tools. Furthermore, we propose a practical three-stage diagnostic algorithm based on the present guidelines. This stepwise approach may provide a practical method for integrating routine and emerging tests for evaluating patients with suspected AIP or PC.
Oral health has long been considered to lie on the fringes of systemic health. And yet, as recently as 2024, the World Health Organization stated once again that oral diseases rank first in the world in terms of prevalence. This review aims to highlight data from the literature demonstrating the relationships between oral and systemic diseases. The tissues of the oral cavity, the teeth, their bony bases, and the supporting tissues can each suffer from pathologies, often infectious, whose consequences can be seen at local, regional, and systemic levels. These disorders affect children and adults of all ages. Their management is a public health issue that involves all medical practitioners.
Climate change threatens human health; however, healthcare itself is an important contributor to our changing climate. Emissions of the major greenhouse gas carbon dioxide take place at every stage of a patient’s healthcare journey, presenting numerous mitigation opportunities. Principles of sustainable healthcare should be employed at individual and organisational levels, with a particular emphasis on improving patient health before intervention becomes necessary to reduce overall population demand for healthcare. This necessitates patient empowerment through choice of treatments and lifestyle changes; clinicians should use patient encounters as opportunities for health promotion. Providing high-quality care at the right time, prioritising getting care right the first time and avoiding complications are key to reducing the environmental impacts of healthcare. Anaesthetists have a role in improving healthcare sustainability by improving their climate literacy, following the latest guidance on the most sustainable anaesthetic techniques, and minimising waste of medications, equipment, energy, and water. The Royal College of Anaesthetists has made progress to address environmental sustainability, prioritising both ‘greener’ speciality practice and its own environmental footprint. This article aims to guide clinicians and healthcare organisations in practising sustainable healthcare, with particular focus on anaesthetic practice in the UK.
In-hospital progressive stroke in patients with pre-existing large artery occlusion presents a therapeutic dilemma, particularly when standard reperfusion strategies are unsuitable. This case report aims to illustrate a mechanism-guided medical approach in this complex scenario.
We report the case of a 65-year-old male patient with chronic left vertebral artery occlusion, hospitalised for dizziness, who experienced acute neurological deterioration, with his National Institutes of Health Stroke Scale (NIHSS) score worsening from 2 to 14. Magnetic resonance imaging confirmed a new pontine infarction. Endovascular therapy was deferred due to suspected artery-to-artery embolism from an unstable plaque. Instead, argatroban infusion combined with antiplatelet therapy was initiated within 5 h.
The patient showed gradual improvement, and follow-up imaging demonstrated complete thrombus resolution. The NIHSS score was reduced to 4 at discharge.
This case highlights the successful use of a mechanism-guided approach with argatroban for thrombolysis-naïve posterior circulation stroke, effectively balancing the thrombotic and haemorrhagic risks. This suggests that personalised anticoagulation may optimise outcomes in similar complex scenarios in which standard reperfusion is unsuitable.
Brachial plexus birth injury is an uncommon but serious obstetric complication. It follows a highly variable course, with outcomes ranging from mild transient impairment to significant lifelong disability. Infants with brachial plexus birth injuries require careful, regular monitoring to guide decision-making about the treatment and management of the condition. A variety of techniques may be implemented in the conservative management of brachial plexus birth injury. These aim to ensure a full range of motion and to prevent muscle imbalances. Surgical management varies but predominantly involves microsurgical nerve grafting and nerve transfers. This may be followed later by further orthopedic procedures depending on the needs of the patient. Despite the long history of this condition in the medical literature, there continues to be broad variations between the management pathways favored by different treatment centers. International research groups are now working to establish a minimum standard of patient outcome data to facilitate more effective future research on this topic. This article summarizes the current strategies used in the assessment and management of patients with brachial plexus birth injury.
Scaphoid fractures are a common but frequently missed wrist injury. Patients classically present following a fall onto an outstretched hand, with pain and tenderness in the anatomical snuffbox. When missed, scaphoid fractures can have serious complications such as avascular necrosis of the scaphoid and scaphoid non-union advanced collapse. The British Society for Surgery of the Hand (BSSH) has recently published guidelines on the management of suspected scaphoid fractures. Initial management involves radiographs and immobilisation, with referral to a specialist within 7 days. Further management includes immobilisation for stable fractures and surgical intervention for cases with greater than 2 mm displacement, associated distal radial fractures, peri-lunate injuries, or proximal pole fractures with any displacement. Follow-up radiographs and computed tomography are used to monitor healing. This article aims to review recent advances in scaphoid fracture management, aligned with the latest BSSH guidelines.
Congenital dermatological diseases (CDD), a complex group of inherited or developmental skin disorders, pose challenges in their management owing to their genetic nature, clinical variability, and socioeconomic impact. Despite the growing body of research, gaps remain in our understanding of research trends, collaborative networks, and translational advancements in this field. This bibliometric study evaluates CDD research over the last three decades (1995–2024) to identify its key developments and emerging themes.
A comprehensive bibliometric analysis was conducted using the Scopus database (October 2024), owing to its extensive peer-reviewed coverage and reliable citation tracking. A multistep search strategy was used to refine the dataset and ensure its relevance. Bibliometrix and VOSviewer were used for quantitative and network-based analyses, highlighting publication trends, author impacts, and thematic structures.
CDD research expanded significantly between 1995 and 2024, yielding 17,984 publications, the majority published in the last decade. The average age of the documents was 11.6 years, reflecting sustained engagement. Author metrics (h-index, g-index, and m-index) identified leading contributors, while co-authorship and collaboration networks revealed global research dynamics. The co-word analysis outlined the evolving thematic framework, including trends in diagnostic tools and therapeutic innovations.
This study provided a structured overview of CDD research and identified critical gaps in its diagnosis, treatment strategies, and interdisciplinary collaboration. Understanding these trends can inform clinical decision-making, enhance early diagnosis, and support the development of targeted therapies, ultimately improving patient outcomes.
The term “allergy” is often used ambiguously, frequently applied to a broad spectrum of immune and non-immune reactions, which can create confusion in clinical practice. Whilst there is some controversy, the term “allergy” is now primarily reserved for describing immunoglobulin E (IgE)-mediated (Type I) hypersensitivity. Accurate diagnosis requires distinguishing Type I reactions from their mimics, especially given the clinical overlap with non-IgE-mediated conditions. This review focuses on Type I hypersensitivity, detailing its immunopathogenesis, clinical features, and diagnostic strategies, with particular emphasis on the role of serum tryptase in confirming mast cell activation. A comprehensive clinical history is crucial to distinguishing IgE-mediated allergy from its mimics, focusing on timing, symptom reproducibility, and the identification of potential triggers. Cofactors like non-steroidal anti-inflammatory drugs, alcohol, and exercise may exacerbate reactions, complicating diagnosis. Prompt intramuscular adrenaline administration is essential in cases of anaphylaxis, while patient education and specialist referral are key to long-term management. The review also examines conditions that may mimic Type I reactions, such as chronic spontaneous urticaria or isolated angioedema. In addition to classical allergy, clinicians must consider conditions such as hereditary angioedema, neuroendocrine tumours, and drug-induced pseudoallergic reactions. Recognising these mimics is vital to prevent misdiagnosis, ensuring patient safety, and avoiding unnecessary allergy labels or suboptimal management. This article provides a structured framework for evaluating suspected allergy, enhancing diagnostic accuracy, and guiding appropriate, patient-centred care across allergy and related clinical disciplines.
Deep vein thrombosis (DVT) is a life-threatening complication in critically ill patients, and conventional prevention strategies are often hindered by poor adherence. This study aims to evaluate the clinical effectiveness of intelligent nursing interventions based on the Integrated Theory of Health Behavior Change (ITHBC) in preventing DVT among critically ill patients.
A retrospective study was conducted on 280 critically ill patients admitted to the intensive care unit of The Affiliated Xuzhou Municipal Hospital of Xuzhou Medical University between January 2022 and December 2024. Patients were divided into a control group (n = 138), which received routine care, and an observation group (n = 142), which received additional ITHBC-based intelligent nursing interventions. The incidence of DVT, nursing adherence, hematological parameters [D-dimer, prothrombin time (PT), and activated partial thromboplastin time (APTT)], complications, and patient satisfaction were compared between groups.
The incidence of DVT was significantly lower in the observation group than in the control group (7.7% vs. 18.1%, p < 0.05). Nursing adherence indicators, including turning frequency, limb exercise compliance, and mechanical prophylaxis use, were markedly higher in the observation group (p < 0.05). After intervention, D-dimer levels decreased significantly in the observation group (p < 0.05), while no significant change was observed in PT or APTT (p > 0.05). No significant differences were found in complication rates (p > 0.05). Patient satisfaction was significantly higher in the observation group (96.5% vs. 87.7%, p < 0.05).
ITHBC-guided intelligent nursing interventions effectively reduced the incidence of DVT, improved patient adherence and satisfaction, and demonstrated strong safety and clinical applicability in critically ill patients.