The moment magnitude scale (MMS; denoted explicitly with M or Mw or Mwg, and generally implied with use of a single M for magnitude[1]) is a measure of an earthquake's magnitude ("size" or strength) based on its seismic moment. Mw was defined in a 1979 paper by Thomas C. Hanks and Hiroo Kanamori. Similar to the local magnitude/Richter scale (ML) defined by Charles Francis Richter in 1935, it uses a logarithmic scale; small earthquakes have approximately the same magnitudes on both scales. Despite the difference, news media often use the term "Richter scale" when referring to the moment magnitude scale.
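The defining relation is compact enough to state directly. The following is a minimal sketch, assuming the widely used form of the Hanks–Kanamori relation with the seismic moment M0 expressed in newton-meters (the constant 9.1 follows that convention; the original 1979 paper stated the equivalent formula with a constant of 10.7 for M0 in dyne-cm, and published constants differ slightly by rounding convention). The example moment value is illustrative, not a real event:

```python
import math

def moment_magnitude(m0: float) -> float:
    """Moment magnitude Mw from seismic moment m0 in newton-meters.

    Uses the common form of the Hanks-Kanamori relation:
        Mw = (2/3) * (log10(M0) - 9.1)
    where the constant 9.1 applies for M0 in N*m.
    """
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Illustrative value only: a seismic moment of 1.1e21 N*m
# corresponds to roughly Mw 8.0.
print(round(moment_magnitude(1.1e21), 1))  # -> 8.0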
Moment magnitude (Mw) is considered the authoritative magnitude scale for ranking earthquakes by size.[2] It is more directly related to the energy of an earthquake than other scales, and does not saturate – that is, it does not underestimate magnitudes as other scales do under certain conditions.[3] It has become the standard scale used by seismological authorities such as the U.S. Geological Survey[4] for reporting large earthquakes (typically M > 4), replacing the local magnitude (ML) and surface-wave magnitude (Ms) scales. Subtypes of the moment magnitude scale (Mww, etc.) reflect different ways of estimating the seismic moment.
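The logarithmic definition also makes the scale's relation to earthquake size concrete: because Mw grows with (2/3) log10 M0, each whole-magnitude step corresponds to a factor of 10^1.5 ≈ 31.6 in seismic moment (and, roughly, in released energy). A quick numerical check of that ratio, reusing the relation from the sketch above:

```python
def moment_from_magnitude(mw: float) -> float:
    """Invert the relation above: seismic moment M0 (in N*m) from Mw."""
    return 10 ** (1.5 * mw + 9.1)

# Each whole-magnitude step multiplies the seismic moment by 10**1.5,
# so an Mw 7.0 event has ~32 times the moment of an Mw 6.0 event.
print(moment_from_magnitude(7.0) / moment_from_magnitude(6.0))  # ~31.6
```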