10 March 2024, Volume 3, Issue 1

  • Melissa De Iuliis, Rayehe Khaghanpour-Shahrezaee, Gian Paolo Cimellaro, Mohammad Khanmohammadi

    Earthquakes are among the natural disasters that affect buildings and communities in developing countries. They cause different levels of damage to buildings, making them uninhabitable for a period of time, called downtime (DT). This paper proposes a Fuzzy Logic hierarchical method to estimate the downtime of residential buildings in developing countries after an earthquake. The use of expert-based systems allows the indicators involved in the model to be quantified using descriptive knowledge instead of hard data, while also accounting for the uncertainties that may affect the analysis. The applicability of the methodology is illustrated using information gathered after the 2015 Gorkha, Nepal, earthquake as a case study. On April 25, 2015, Nepal was hit by the Mw 7.8 Gorkha earthquake, which damaged or destroyed more than 500,000 residential buildings. Information obtained from a Rapid Visual Damage Assessment (RVDA) is used through a hierarchical scheme to evaluate building damageability. A sensitivity analysis based on the Sobol method is implemented to evaluate the importance of the parameters gathered in the RVDA for building damage estimation. The findings of this work may be used to estimate the restoration time of damaged buildings in developing countries and to plan preventive safety measures.
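
    As a minimal illustration of the Sobol method mentioned above (not code from the paper), the Python sketch below computes first-order and total-order sensitivity indices with the pick-freeze (Saltelli) estimators; the toy damage-score function and its three normalised RVDA-style indicators are assumptions made purely for the example.

    import numpy as np

    def toy_damage_score(x):
        # Hypothetical stand-in for the RVDA-based hierarchical damage model;
        # x has shape (n, 3) with normalised indicators in [0, 1].
        return 0.6 * x[:, 0] + 0.3 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

    def sobol_indices(f, d, n=20_000, seed=0):
        # Pick-freeze (Saltelli) estimators of first-order (S1) and total-order (ST) indices.
        rng = np.random.default_rng(seed)
        A, B = rng.random((n, d)), rng.random((n, d))
        fA, fB = f(A), f(B)
        var = np.var(np.concatenate([fA, fB]), ddof=1)
        S1, ST = np.empty(d), np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                            # freeze all inputs except i
            fABi = f(ABi)
            S1[i] = np.mean(fB * (fABi - fA)) / var        # first-order index
            ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total-order index
        return S1, ST

    S1, ST = sobol_indices(toy_damage_score, d=3)
    print("first-order:", S1.round(3), "total-order:", ST.round(3))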

  • Delbaz Samadian, Imrose B. Muhit, Annalisa Occhipinti, Nashwan Dawood

    Traditionally, nonlinear time history analysis (NLTHA) is used to assess the performance of structures under future hazards, which is necessary to develop effective disaster risk management strategies. However, this method is computationally intensive and not suitable for analyzing a large number of structures on a city-wide scale. Surrogate models offer an efficient and reliable alternative and facilitate evaluating the performance of multiple structures under different hazard scenarios. However, creating a comprehensive database for surrogate modelling at the city level presents challenges. To overcome this, the present study proposes meta databases and a general framework for surrogate modelling of steel structures. The dataset includes 30,000 steel moment-resisting frame buildings, representing low-rise, mid-rise and high-rise buildings, with criteria for connections, beams, and columns. Pushover analysis is performed and structural parameters are extracted; finally, using random forest models together with Shapley additive explanations, sensitivity and explainability analyses of the structural parameters are performed to identify the most significant factors in designing steel moment-resisting frames. The framework and databases can be used as a validated source for surrogate modelling of steel frame structures to support disaster risk management.
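
    As an illustrative sketch of the random forest plus Shapley additive explanations (SHAP) step (synthetic data and feature names, not the paper's 30,000-building database), the Python example below fits a random forest to a toy table of frame features and ranks them by mean absolute SHAP value.

    import numpy as np
    import shap                                   # SHAP library for model explanations
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical design features of steel moment-resisting frames and a
    # pushover-style response quantity; both are invented for this sketch.
    X = np.column_stack([
        rng.integers(2, 21, 2000),        # number of storeys
        rng.uniform(4.0, 9.0, 2000),      # bay width (m)
        rng.uniform(0.5, 2.0, 2000),      # column-to-beam strength ratio
    ])
    y = 0.05 * X[:, 0] + 0.02 * X[:, 1] - 0.03 * X[:, 2] + rng.normal(0, 0.01, 2000)

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # Tree-based SHAP values attribute each prediction to the features; their
    # mean absolute value is a common global importance / sensitivity measure.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)
    print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0).round(4))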

  • Samantha Louise N. Jarder, Osamu Maruyama, Lessandro Estelito O. Garciano

    Losses due to hazards are inevitable, and the numerical simulations used to estimate them are complex. This study proposes a model for estimating correlated seismic damages and losses of a water supply pipeline system as an alternative to numerical simulations. The common approach in previous research treats the average number of damage spots per mesh as statistically independent of one another. Spatially distributed lifeline systems, such as water supply pipelines, are interconnected, and seismic spatial variability affects the damages across the region; thus, the spatial correlation of damage spots is an important factor in target areas for portfolio loss estimation. Generally, simulations are used to estimate possible losses; however, these assume that individual damages are independent and uncorrelated. This paper assumes that damages per mesh follow a Poisson distribution to avoid over-dispersion and to eliminate negative losses in the estimations. The purpose of this study is to obtain a probabilistic portfolio loss model for an extensive water supply area. The proposed model was compared to numerical simulation data based on the correlated Poisson distribution. The Normal To Anything (NORTA) method is applied to obtain correlations for the Poisson distributions. The proposed probabilistic portfolio loss model, based on the generalized linear model and the central limit theorem, estimates possible losses such as the Probable Maximum Loss (PML, 90% non-exceedance) and the Normal Expected Loss (NEL, 50% non-exceedance). The proposed model can be used for other lifeline systems as well, though additional investigation is needed for confirmation. From the estimations, a seismic physical portfolio loss for the water supply system was presented; the portfolio was constructed to show possible outcomes for the system. The proposed method was tested and analyzed using an artificial field and a location-based scenario of a water supply pipeline system. This approach would aid pre-disaster planning and requires only a few steps and little time.
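
    The Python sketch below illustrates a NORTA-style (Gaussian copula) construction for correlated Poisson damage counts and toy PML/NEL percentiles; the per-mesh intensities and correlation matrix are invented for the example, and the correlation-matching adjustment of the full NORTA procedure is omitted.

    import numpy as np
    from scipy.stats import norm, poisson

    def norta_poisson(lambdas, corr, n, seed=0):
        # Correlated Poisson counts via a Gaussian copula: correlated normals ->
        # uniforms -> Poisson inverse CDF. Strict NORTA would first adjust corr
        # so the output Poisson correlation matches a target value.
        rng = np.random.default_rng(seed)
        d = len(lambdas)
        z = rng.multivariate_normal(np.zeros(d), corr, size=n)
        u = norm.cdf(z)
        return np.column_stack([poisson.ppf(u[:, i], lambdas[i]) for i in range(d)])

    # Hypothetical mean damage spots per mesh for three meshes and an assumed
    # correlation matrix in the underlying Gaussian space.
    lambdas = [1.2, 0.8, 2.5]
    corr = np.array([[1.0, 0.6, 0.3],
                     [0.6, 1.0, 0.4],
                     [0.3, 0.4, 1.0]])
    counts = norta_poisson(lambdas, corr, n=10_000)
    losses = counts.sum(axis=1)                    # toy portfolio loss per realisation
    print("NEL (50%):", np.percentile(losses, 50), "PML (90%):", np.percentile(losses, 90))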

  • Roman Schotten, Evelyn Mühlhofer, Georgios-Alexandros Chatzistefanou, Daniel Bachmann, Albert S. Chen, Elco E. Koks

    Natural hazards impact the interdependent infrastructure networks that keep modern society functional. While a variety of modelling approaches are available to represent critical infrastructure networks (CINs) at different scales and to analyse the impacts of natural hazards, a recurring challenge for all modelling approaches is the availability and accessibility of sufficiently high-quality input and validation data. The resulting data gaps often require modellers to assume specific technical parameters, functional relationships, and system behaviours. In other cases, expert knowledge from one sector is extrapolated to other sectoral structures or even applied cross-sectorally to fill data gaps. The uncertainties introduced by these assumptions and extrapolations, and their influence on the quality of modelling outcomes, are often poorly understood and difficult to capture, thereby eroding the reliability of these models to guide resilience enhancements. Additionally, how to overcome the data availability challenges in CIN modelling, with respect to each modelling purpose, remains an open question. To address these challenges, a generic modelling workflow is derived from existing modelling approaches to examine model definition and validation, as well as the six CIN modelling stages: mapping of infrastructure assets, quantification of dependencies, assessment of natural hazard impacts, response and recovery, quantification of CI services, and adaptation measures. The data requirements of each stage were systematically defined, and the literature on potential data sources was reviewed to enhance data collection and raise awareness of potential pitfalls. The derived workflow then feeds into a framework for assessing data availability challenges. This is shown through three case studies, taking into account their different modelling purposes: hazard hotspot assessments, hazard risk management, and sectoral adaptation. Based on the three model purpose types, a framework is suggested to explore the implications of data scarcity for certain data types, as well as its reasons and consequences for CIN model reliability. Finally, a discussion on overcoming the challenges of data scarcity is presented.
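
    As a toy illustration of two of the listed stages, mapping infrastructure assets and quantifying CI services, the Python sketch below represents a hypothetical network as a graph and measures service loss when hazard-impacted assets are removed; all node names and the hazard footprint are assumptions made for the example.

    import networkx as nx

    # Hypothetical asset map: sources (substations), intermediate assets (pumps)
    # and demand nodes (districts); edges are functional dependencies.
    G = nx.Graph()
    G.add_edges_from([
        ("substation_A", "pump_1"), ("substation_A", "pump_2"),
        ("pump_1", "district_1"), ("pump_2", "district_2"),
        ("substation_B", "pump_2"),
    ])
    source_nodes = ["substation_A", "substation_B"]
    demand_nodes = ["district_1", "district_2"]

    def served_districts(graph):
        # Count demand nodes still connected to at least one source.
        return sum(
            any(graph.has_node(s) and nx.has_path(graph, s, d) for s in source_nodes)
            for d in demand_nodes if graph.has_node(d)
        )

    baseline = served_districts(G)
    impacted = G.copy()
    impacted.remove_nodes_from(["pump_1"])          # assets inside the hazard footprint
    print("CI service retained:", served_districts(impacted), "of", baseline)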

  • Mohanad Khazaali, Liyang Ma, Keivan Rokneddin, Matteo Mazzotti, Paolo Bocchini

    An accurate estimation of wind loads on telecommunication towers is crucial for design, as well as for performing reliability, resilience, and risk assessments. In particular, the drag coefficient and interference factor are the most significant factors for wind load computations. Wind tunnel tests and computational fluid dynamics (CFD) are the most appropriate methods to estimate these parameters. While wind tunnel tests are generally preferred in practice, they require dedicated facilities and personnel, and can be expensive if multiple configurations of tower panels and antennas need to be tested under various wind directions (e.g., fragility curve development for system resilience analysis). This paper provides a simple, robust, and easily accessible CFD protocol with widespread applicability, offering a practical solution in situations where wind tunnel testing is not feasible, such as complex tower configurations or cases where the cost of running experiments for all tower-antenna configurations is prohibitively high. Different turbulence models, structural and fluid boundary conditions, and mesh types are tested to provide a streamlined CFD modeling strategy that shows good convergence and balances accuracy, computational time, and robustness. The protocol is calibrated and validated against experimental studies available in the literature. To demonstrate its capabilities, three lattice tower panels and antennas with different configurations are analyzed as examples. The protocol successfully estimates the drag and lateral wind loads and their coefficients under different wind directions. Noticeable differences are observed between the wind loads estimated with this protocol and those computed by the simple linear superposition used in most practical applications, indicating the importance of tower-antenna interaction. Also, as expected, the wind loads recommended by design codes overestimate the simulated results. More importantly, the telecommunication design codes inadequately identify the most favorable wind directions, those associated with the lowest wind loads, while the results of the proposed protocol align with observations from experimental studies. This information may be used to select the tower orientation before construction. The findings of this study are of importance for the telecommunication industry, which seeks reliable results with minimal computational effort. In addition, the protocol enhances the fragility analysis of telecommunication towers under strong winds, and the portfolio risk and resilience assessment of telecommunication systems.
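
    For reference, the drag coefficient discussed above follows the standard definition Cd = F / (0.5 * rho * V^2 * A); the short Python sketch below applies it to hypothetical panel values that are not taken from the paper.

    def drag_coefficient(drag_force_n, air_density_kg_m3, wind_speed_m_s, ref_area_m2):
        # Cd = F / (0.5 * rho * V**2 * A), with force in N, density in kg/m^3,
        # speed in m/s and reference (projected) area in m^2.
        return drag_force_n / (0.5 * air_density_kg_m3 * wind_speed_m_s ** 2 * ref_area_m2)

    # Assumed numbers for a single lattice-tower panel: 2.4 kN of drag at 30 m/s
    # on 1.8 m^2 of projected member area, in air at 1.225 kg/m^3.
    cd = drag_coefficient(2400.0, 1.225, 30.0, 1.8)
    print(f"Cd = {cd:.2f}")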

  • Sanjeev Bhatta, Xiandong Kang, Ji Dang

    This study examines the feasibility of using a machine learning approach for rapid damage assessment of reinforced concrete (RC) buildings after an earthquake. Since real-world damage datasets are scarce, have limited access, or are imbalanced, a simulation dataset is prepared by conducting nonlinear time history analyses. Different machine learning (ML) models are trained on structural parameters and ground motion characteristics to classify RC building damage into five categories: null, slight, moderate, heavy, and collapse. The random forest classifier (RFC) achieved the highest prediction accuracy on the testing and real-world damage datasets. The structural parameters can be extracted using different means such as Google Earth, Open Street Map, unmanned aerial vehicles, etc. However, recording ground motion close to a site requires a dense array of sensors, which is costly, and for places with no recording station or device it is difficult to obtain ground motion characteristics. Therefore, different ML-based regression models are developed utilizing past-earthquake information to predict ground motion parameters such as peak ground acceleration and peak ground velocity. The random forest regressor (RFR) achieved better results than other regression models on the testing and validation datasets. Furthermore, compared with the results of similar research works, better results are obtained using the RFC and RFR on the validation datasets. Finally, these models are utilized to predict the damage categories of RC buildings at Saitama University and Okubo Danchi, Saitama, Japan, after an earthquake. This damage information is crucial for government agencies and decision-makers to respond systematically in post-disaster situations.
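
    As a minimal sketch of the classification step (synthetic features and labels, not the paper's simulated dataset), the Python example below trains a random forest classifier to assign the five damage categories from assumed structural parameters and ground motion intensity measures.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    labels = ["null", "slight", "moderate", "heavy", "collapse"]

    # Hypothetical feature table and labelling rule, invented for this sketch.
    X = np.column_stack([
        rng.integers(1, 11, 5000),          # number of storeys
        rng.uniform(1960, 2020, 5000),      # construction year
        rng.uniform(0.05, 1.2, 5000),       # peak ground acceleration (g)
        rng.uniform(2, 120, 5000),          # peak ground velocity (cm/s)
    ])
    damage = 0.8 * X[:, 2] + 0.004 * X[:, 3] + 0.02 * X[:, 0] + rng.normal(0, 0.05, 5000)
    y = np.digitize(damage, bins=[0.3, 0.6, 0.9, 1.2])     # 0..4 -> five classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print("test accuracy:", round(clf.score(X_te, y_te), 3))
    print("predicted category:", labels[int(clf.predict([[6, 1975, 0.8, 60]])[0])])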

  • Yating Zhang, Bilal M. Ayyub, Juan F. Fung, Zachary M. Labe

    In the last decade, the detection and attribution science that links climate change to extreme weather and climate events has emerged as a growing field of research with an increasing body of literature. This paper reviews the methods for extreme event attribution (EEA) and discusses the new insights that EEA provides for infrastructure adaptation. We find that EEA can inform stakeholders about current climate risk, support vulnerability-based and hazard-based adaptation, assist in the development of cost-effective adaptation strategies, and enhance justice and equity in the allocation of adaptation resources. As engineering practice shifts from a retrospective approach to a proactive, forward-looking risk management strategy, EEA can be used together with climate projections to enhance the comprehensiveness of decision making, including planning and preparing for unprecedented extreme events. Additionally, attribution assessment can be more useful for adaptation planning when the exposure and vulnerability of communities to past events are analyzed and future changes in the probability of extreme events are evaluated. Given the large uncertainties inherent in event attribution and climate projections, future research should examine the sensitivity of engineering design to climate model uncertainties and adapt engineering practice, including building codes, to uncertain future conditions. While this study focuses on adaptation planning, EEA can also be a useful tool for informing and enhancing decisions related to climate mitigation.
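
    Two standard attribution quantities underpin much of the EEA literature, the probability ratio PR = P1/P0 and the fraction of attributable risk FAR = 1 - P0/P1; the short Python sketch below evaluates them for hypothetical exceedance probabilities.

    def probability_ratio(p_factual, p_counterfactual):
        # PR = P1 / P0: how much more likely the event is in today's climate
        # than in a counterfactual climate without anthropogenic forcing.
        return p_factual / p_counterfactual

    def fraction_attributable_risk(p_factual, p_counterfactual):
        # FAR = 1 - P0 / P1: fraction of the event probability attributable to the forcing.
        return 1.0 - p_counterfactual / p_factual

    # Assumed exceedance probabilities for a heat-extreme threshold, e.g. from
    # factual and counterfactual climate-model ensembles (illustrative only).
    p1, p0 = 0.10, 0.02
    print("PR  =", probability_ratio(p1, p0))                 # 5.0 -> five times as likely
    print("FAR =", fraction_attributable_risk(p1, p0))        # 0.8 -> 80% attributable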

  • Megan Boston, Desmond Bernie, Liz Brogden, Alan Forster, Laurent Galbrun, Leigh-Anne Hepburn, Taibat Lawanson, Jolanda Morkel

    This paper evaluates literature across multiple disciplines and stakeholder types to identify commonalities and contradictions in definitions for community resilience. It aims to support cross-disciplinary discourse to build an interdisciplinary understanding of community resilience. This work identifies the differences between mono-, multi-, inter-, and cross-disciplinary approaches to inform community resilience strategies in academic and practice-based contexts.

    Four themes for community resilience were identified through a review of cross-disciplinary literature. These include (1) diverse yet convergent definitions of community resilience and the evolution from equilibrium to adaptation to transformation; (2) equitable and inclusive strategies for the development of community resilience initiatives; (3) when and at what scale strategies should be implemented; and (4) community resilience as a process or an outcome.

    This work is valuable to those seeking to familiarise themselves with the concept of community resilience, including educators who deliver courses on community resilience and policy-makers. It is novel in that it presents an interdisciplinary framework for navigating the community resilience discourse beyond individual professional boundaries.