DocumentCode
2144329
Title
Assessment of Emergency Response Policy Based on Markov Process
Author
Zhou, Yafei ; Liu, Mao ; Hu, Liming
Author_Institution
Center for Urban Public Safety Res., Nankai Univ., Tianjin, China
fYear
2010
fDate
14-16 Aug. 2010
Firstpage
630
Lastpage
634
Abstract
Major accidents not only endanger the health and safety of the population but also damage the surrounding environment. To mitigate these adverse effects, an emergency response policy (ERP) and corresponding protective actions should be established. To maximize its effectiveness, the ERP should be evaluated and optimized, with the most important criterion being minimization of the accident's health consequences. In this paper, a discrete-state stochastic Markov process is used to simulate the movement of evacuees. The solution of the Markov process gives the expected distribution of evacuees in the area as a function of time. Then, based on how extreme phenomena affect individuals and on the dose-response relationship, the health effects on the population are calculated, determining the accident's health consequences. Finally, different emergency response policies are evaluated by their corresponding health consequences, so that the emergency policy can be optimized.
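The evacuation model described above can be sketched as a discrete-state Markov chain: zones are states, a transition matrix encodes movement probabilities per time step, and repeated multiplication yields the expected evacuee distribution over time. The three-zone layout and transition probabilities below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical 3-zone layout: 0 = hazard zone, 1 = evacuation route, 2 = shelter.
# Transition matrix P: P[i, j] is the probability an evacuee in zone i
# moves to zone j during one time step (rows sum to 1; shelter is absorbing).
P = np.array([
    [0.6, 0.4, 0.0],   # hazard: stay, or enter the evacuation route
    [0.0, 0.5, 0.5],   # route: stay, or reach the shelter
    [0.0, 0.0, 1.0],   # shelter: absorbing state
])

def evacuee_distribution(p0, P, steps):
    """Expected distribution of evacuees over zones after `steps` time steps."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p = p @ P  # one Markov transition
    return p

p0 = [1.0, 0.0, 0.0]                    # everyone starts in the hazard zone
dist = evacuee_distribution(p0, P, 10)  # distribution after 10 time steps
```

The time-indexed distribution produced this way could then feed a dose-response calculation: the expected dose accumulates from the fraction of the population remaining in exposed zones at each step.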
Keywords
Markov processes; accidents; emergency services; health and safety; socio-economic effects; ERP; discrete Markov process; emergency response policy assessment; evacuee distribution; public health; atmospheric modeling; numerical models; vehicles; emergency response policy; health consequence; stochastic Markov evacuation model
fLanguage
English
Publisher
ieee
Conference_Titel
Granular Computing (GrC), 2010 IEEE International Conference on
Conference_Location
San Jose, CA
Print_ISBN
978-1-4244-7964-1
Type
conf
DOI
10.1109/GrC.2010.85
Filename
5576018
Link To Document