# 第40回統計地震学セミナー / The 40th Statistical Seismology Seminar

(Date & Time)
18 February 2014 (Tuesday), 15:00–17:30

(Place)
Room D312B, Institute of Statistical Mathematics

(Program)

15:00 –
Bayesian estimation of doubly stochastic Poisson processes: a particle filtering approach
By Elisa Varini (Institute of Applied Mathematics and Information Technology, National Research Council, Italy)

【Abstract】
We aim to explore the hypothesis that the earthquakes of a seismic region occur under different physical conditions, each corresponding to a seismicity phase characterized by its own occurrence rate.
This hypothesis can be modeled by a doubly stochastic Poisson process, in which the observed occurrence times of the earthquakes form a point process whose conditional intensity function depends on both the past history and the current hidden state.
For several possible choices of the observed point process and the hidden state process, a Bayesian analysis is carried out in which the likelihood function is approximated by particle filtering.
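As a rough illustration of the particle-filtering idea described above (not the speaker's actual model), the sketch below filters a hypothetical two-phase Markov-modulated Poisson process: a hidden quiet/active state modulates the event rate in small time bins, and a bootstrap particle filter recovers the filtered phase probabilities together with an approximate log-likelihood. All rates and parameters here are invented for illustration.

```python
from math import factorial

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-phase model (illustrative values only): the hidden
# state switches between a quiet phase (rate lam[0]) and an active
# phase (rate lam[1]); event counts are observed in small time bins.
lam = np.array([0.5, 3.0])   # occurrence rates per unit time
p_stay = 0.95                # prob. the hidden phase persists per bin
dt = 0.1                     # bin width
T = 400                      # number of bins

# Simulate a hidden phase path and the binned event counts.
state = np.zeros(T, dtype=int)
for t in range(1, T):
    state[t] = state[t - 1] if rng.random() < p_stay else 1 - state[t - 1]
counts = rng.poisson(lam[state] * dt)

# Bootstrap particle filter: propagate particles through the hidden
# Markov chain, weight each by the Poisson likelihood of the observed
# count, resample, and accumulate the log-likelihood estimate.
N = 2000
particles = rng.integers(0, 2, size=N)
loglik = 0.0
post = np.empty(T)           # filtered P(active phase | data so far)
for t in range(T):
    flip = rng.random(N) >= p_stay
    particles = np.where(flip, 1 - particles, particles)
    mu = lam[particles] * dt
    w = np.exp(-mu) * mu ** counts[t] / factorial(int(counts[t]))
    loglik += np.log(w.mean())
    particles = particles[rng.choice(N, size=N, p=w / w.sum())]
    post[t] = particles.mean()

print(f"approx. log-likelihood: {loglik:.2f}")
```

The same weighting-and-resampling loop yields the likelihood approximation used inside a Bayesian analysis: the averaged particle weights at each step estimate the one-step predictive density of the observations.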

––– Tea break –––

Around 16:30 –
Stochastic Earthquake Models: Ways to Improve and Insights into the Physical Process
By David Harte (Statistics Research Associates Limited (SRA), New Zealand)

【Abstract】
We present a version of the ETAS model in which the offspring rates vary both spatially and temporally, in response to deficiencies discussed in [1]. This is achieved by distinguishing between those space-time volumes where the interpoint space-time distances are small and those where they are considerably larger. In the process of modifying a stochastic earthquake model, one needs to justify the assumptions made, and these in turn raise questions about the nature of the underlying physical process. We will use this version of the ETAS model as the basis for our discussion and, by focussing on aspects where the model does not perform so well, attempt to find physical explanations for such lack of fit. Some possible discussion points are as follows.
- What is the nature of the so-called background process in the ETAS model? Is it simply a temporal boundary ($t=0$) correction, or does it represent an additional tectonic process not described by the aftershock component? Or are these two alternatives on completely different time scales?
- An epidemic (the basic analogy underpinning the ETAS model), or a living organism, can evolve by producing offspring that are slightly different from their parents through randomness or gene mutation. Certain "modified" individuals adapt to the environment better and tend to survive over others. In the ETAS context, a lower value of $\alpha$ produces more "generations" in the aftershock sequence, allowing a richer and more complex evolution of the process, both spatially and temporally. Alternatively, if $\alpha$ is large, then more of the aftershocks are direct offspring of the mainshock. In the epidemic context, this implies that the mainshock contains much more of the "DNA" which governs the evolution of the overall sequence.
- What is the relationship between fractal dimension and clustering? Does the fractal dimension provide a better discrimination between those space-time volumes with higher offspring rates and the others? If so, does the fractal dimension provide a more obvious physical description of the difference between these high-rate volumes and the lower-rate volumes, and hence a suggestive physical explanation?
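For reference, one common temporal parameterisation of the ETAS conditional intensity, which fixes the notation for the background rate $\mu$ and the productivity parameter $\alpha$ discussed above, is

$$
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{i:\, t_i < t} \frac{K\, e^{\alpha (m_i - M_0)}}{(t - t_i + c)^{p}},
$$

where $t_i$ and $m_i$ are the past occurrence times and magnitudes, $M_0$ is the magnitude cutoff, and $K$, $c$, $p$ are the modified-Omori parameters. A larger $\alpha$ concentrates offspring productivity on the largest events, which is the sense in which the mainshock then carries more of the sequence's "DNA".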

[1] Harte, D.S. (2013). Bias in Fitting the ETAS Model: A Case Study Based on New Zealand Seismicity. Geophys. J. Int. 192(1), 390-412.