Posted by Petra Challus on March 23, 2003 at 05:37:05:
Hi EQF,

Sometimes I think the more you think you know, the more you display how much you don't know. Earthquakes vary by specific area, so one has to know the historical seismicity record to determine probabilities. Just as an example, if you wanted to know the chance of a 3.0 quake in the San Francisco Bay Area, you would have to take all of the data and remove the largest quakes and all of their aftershocks in order to determine the statistical average. Prior to the Loma Prieta earthquake the average was one every 18 days; in the past 14 years it has changed, though by what factor I am not sure. We certainly don't have one every 18 days anymore. The interval is much longer, especially if you remove the Geysers Geothermal Area.

I do not agree with your assessment that the science community has not worked hard in this area. We have to remember that they have collected the data for everyone to use by instrumenting the areas with a high seismic record. I am certain, though it is not spoken about on message boards, that there are quite a number who do their own homework looking for the answer as to where the next big one is going to hit, and who can cite the averages for any location in a short period of time.

In California we are blessed with a plethora of data for the largest and most dangerous faults, and though long-term probabilities are well known, moving those into intermediate (5-year) or short-term (days) forecasts is not so easy. While seismic gaps give us some good places to look, and the presence of new activity in those areas might give us some clues, the answer today remains elusive.

I have a link on my site to Dr. Stefan Weimer's web site, and I think if you go and have a look you will find he has created a very nice school for the average layman. I would give it a week or more to digest the information he presents, and perhaps you will find it a very good reference. Petra
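The averaging procedure described above (strip out the largest quakes and their aftershocks, then average the intervals between the remaining M 3.0+ events) can be sketched roughly as follows. The catalog here is made up for illustration, and the fixed 30-day "shadow" window is a crude stand-in for real declustering methods such as Gardner-Knopoff windows; real work would pull a regional catalog, for example from the NCEDC:

```python
from datetime import datetime, timedelta

# Hypothetical catalog of (origin time, magnitude) pairs -- not real data.
catalog = [
    (datetime(2000, 1, 1), 3.1),
    (datetime(2000, 1, 3), 5.2),   # a larger event...
    (datetime(2000, 1, 4), 3.4),   # ...followed by likely aftershocks
    (datetime(2000, 1, 5), 3.0),
    (datetime(2000, 2, 10), 3.2),
    (datetime(2000, 3, 20), 3.1),
    (datetime(2000, 5, 1), 3.3),
]

def decluster(events, mainshock_mag=5.0, window_days=30):
    """Crude windowing decluster: drop events at or above mainshock_mag,
    plus anything within window_days after one (assumed aftershocks)."""
    kept = []
    shadow_until = None
    for t, m in sorted(events):
        if m >= mainshock_mag:
            shadow_until = t + timedelta(days=window_days)
            continue
        if shadow_until is not None and t <= shadow_until:
            continue
        kept.append((t, m))
    return kept

def mean_interval_days(events, min_mag=3.0):
    """Average number of days between successive events of min_mag or larger."""
    times = [t for t, m in events if m >= min_mag]
    if len(times) < 2:
        return None
    gaps = [(b - a).days for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps)

independent = decluster(catalog)
print(mean_interval_days(independent))  # mean recurrence for this toy catalog
```

With a long enough declustered record, the same two-step computation is what yields a figure like "one M 3.0 every 18 days" for a given region.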