The Institute of Statistical Mathematics

The 19th Statistical Seismology Seminar

Date and Time
Friday, March 4, 2011, 15:00–16:20
Venue
The Institute of Statistical Mathematics, Room D312A
[Talk 1]
 
Title
Analysis of Earthquake Inter-event Times
Speaker
Abdelhak Talbi (Earthquake Research Institute, The University of Tokyo)
Abstract
Understanding the temporal behavior of earthquakes is a fundamental step towards building reliable statistical models that fit observed seismicity. A successful class of models assumes two seismicity components corresponding to a stable 'background' rate and a varying 'triggered' rate or inter-event times. In this study, the distribution of inter-event times is modeled assuming that triggered events are governed by a non-homogeneous Poisson process and background events by different hypothetical distributions (exponential, gamma, and Weibull). The model is introduced analytically using the Palm-Khinchine equations and fitted in practice to seismicity data from southern California, Japan, and Turkey. The analytic form of the distribution is discussed when different prior hypotheses are adopted. In a second step, the temporal clustering of events is studied using the distance between the full distribution of inter-event times and the residual distributions obtained with different declustering approaches. Short- and long-range correlations are studied in space and time. The residual background process is found to be dominant around the mean inter-event time and the mean inter-event distance. This analysis describes seismicity as the accumulation of local perturbations related to a single mean-field 'background' process characterized by the mean inter-event time and the mean inter-event distance.
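For readers unfamiliar with the distribution-fitting step mentioned in the abstract, the following is a minimal sketch (not the speaker's code) of fitting the three hypothetical background distributions (exponential, gamma, Weibull) to a set of inter-event times by maximum likelihood with scipy; the synthetic inter-event times and all parameter values are purely illustrative.

```python
import numpy as np
from scipy import stats

# Illustrative inter-event times in days; in the study these would come from
# the southern California, Japan, or Turkey catalogues.
rng = np.random.default_rng(0)
dt = rng.gamma(shape=0.7, scale=10.0, size=2000)

candidates = {
    "exponential": stats.expon,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    # Maximum-likelihood fit with the location fixed at zero,
    # since inter-event times are non-negative.
    params = dist.fit(dt, floc=0)
    loglik = np.sum(dist.logpdf(dt, *params))
    print(f"{name:12s} log-likelihood = {loglik:.1f}")
```

Comparing the resulting log-likelihoods (or an information criterion built from them) is one simple way to judge which hypothetical background distribution best describes the observed inter-event times.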
[Talk 2]
 
Title
Short-term earthquake forecasting before and during the L’Aquila (Central Italy) seismic sequence of April 2009
Speaker
Rodolfo Console (National Institute of Geophysics and Volcanology (INGV), Italy)
Abstract
The M5.9 earthquake that occurred on April 6, 2009, causing more than 300 casualties in the city of L'Aquila and neighboring villages in Central Italy, immediately generated much discussion about the potential practical use of foreshocks and other kinds of information for mitigating seismic risk among the population. These discussions triggered studies on the validity of statistical clustering models such as the ETAS model, not only for forecasting aftershocks but also for forecasting mainshocks following potential foreshocks. Within the framework of these studies, this presentation reports preliminary results of a statistical analysis of the L'Aquila seismic sequence by means of a version of the ETAS family of models. The free parameters used in the algorithm are obtained by the maximum-likelihood method from a learning data set of instrumental seismicity collected from 2005 up to March 2009 in the L'Aquila region. Our method includes statistical declustering of the background seismicity by an iterative process, repeated until the maximum likelihood of the learning data set under the ETAS model is attained. For testing purposes, an algorithm for simulating seismic series has been developed and applied to produce synthetic catalogues, whose statistical properties are compared with those of the real catalogue. Finally, daily forecasts of earthquakes at different threshold magnitudes were produced for a testing period including the L'Aquila 2009 mainshock and its largest aftershocks. The results show that the probability of occurrence of an M5.9 event, computed from the ETAS algorithm at the midnight preceding the L'Aquila 2009 mainshock, although much higher than the background Poisson probability, was quite low compared with reasonable expectations for a practical operational forecast. Moreover, the comparison between the daily rate expected by the ETAS forecast method and the real daily number of aftershocks shows a systematic underestimation of that rate.
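As background for the forecasting step described above, the following is a minimal sketch (not the authors' implementation) of the standard temporal ETAS conditional intensity and of how an expected daily rate can be turned into a Poisson probability of at least one event; all parameter values and the toy catalogue are purely illustrative assumptions.

```python
import numpy as np

def etas_intensity(t, times, mags, mu, K, alpha, c, p, m0):
    """Temporal ETAS conditional intensity (events/day) at time t,
    given past event times (days) and magnitudes above m0."""
    past = times < t
    trig = K * np.exp(alpha * (mags[past] - m0)) / (t - times[past] + c) ** p
    return mu + trig.sum()

# Illustrative parameters and toy catalogue; in the study these are obtained
# by maximum-likelihood fitting to a learning data set of instrumental seismicity.
mu, K, alpha, c, p, m0 = 0.2, 0.05, 1.2, 0.01, 1.1, 2.5
times = np.array([0.0, 1.3, 2.7, 3.1])   # event times in days
mags = np.array([3.0, 2.6, 4.1, 2.8])

# Expected number of events above m0 during the next day (t in [3.2, 4.2]),
# by trapezoidal integration of the intensity, and the Poisson probability
# of observing at least one such event in that day.
grid = np.linspace(3.2, 4.2, 2001)
rate = np.array([etas_intensity(t, times, mags, mu, K, alpha, c, p, m0) for t in grid])
n_expected = np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(grid))
p_one_or_more = 1.0 - np.exp(-n_expected)
print(f"expected events: {n_expected:.2f}, P(at least one) = {p_one_or_more:.2f}")
```

Daily forecasts at a higher threshold magnitude would additionally require a magnitude distribution (e.g., Gutenberg-Richter) to scale the expected rate, which is omitted from this sketch.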