Moment magnitude scale

The moment magnitude scale (MMS; denoted explicitly with M or Mw or Mwg, and generally implied with use of a single M for magnitude[1]) is a measure of an earthquake's magnitude ("size" or strength) based on its seismic moment. Mw was defined in a 1979 paper by Thomas C. Hanks and Hiroo Kanamori. Similar to the local magnitude/Richter scale (ML) defined by Charles Francis Richter in 1935, it uses a logarithmic scale; small earthquakes have approximately the same magnitudes on both scales. Despite the difference, news media often use the term "Richter scale" when referring to the moment magnitude scale.
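For concreteness, the Hanks–Kanamori relation defines Mw from the seismic moment M0 as Mw = (2/3)(log10 M0 − 9.1), with M0 in newton-metres (equivalent to the original (2/3) log10 M0 − 10.7 with M0 in dyne-centimetres). The sketch below illustrates that relation; the function name and the example moment value are illustrative, not drawn from any cited source.

```python
import math

def moment_magnitude(m0_newton_metres: float) -> float:
    """Moment magnitude Mw from seismic moment M0 (Hanks & Kanamori, 1979).

    SI form: Mw = (2/3) * (log10(M0) - 9.1), with M0 in newton-metres;
    equivalent to the original (2/3) * log10(M0) - 10.7 with M0 in dyne-cm.
    """
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Illustrative (hypothetical) value: M0 = 4.0e19 N*m gives Mw of about 7.0.
print(round(moment_magnitude(4.0e19), 1))  # -> 7.0
```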

Moment magnitude (Mw) is considered the authoritative magnitude scale for ranking earthquakes by size.[2] It is more directly related to the energy of an earthquake than other scales, and does not saturate – that is, it does not underestimate magnitudes as other scales do in certain conditions.[3] It has become the standard scale used by seismological authorities like the U.S. Geological Survey[4] for reporting large earthquakes (typically M > 4), replacing the local magnitude (ML) and surface-wave magnitude (Ms) scales. Subtypes of the moment magnitude scale (Mww, etc.) reflect different ways of estimating the seismic moment.
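Because the definition above is logarithmic with a 3/2 exponent, each whole-unit increase in Mw corresponds to a seismic moment (and, roughly, a released energy) about 31.6 times larger, and two units to about 1000 times larger. A minimal sketch of that scaling, using the inverse of the relation given earlier (the function name is hypothetical):

```python
def seismic_moment(mw: float) -> float:
    """Seismic moment M0 (in N*m) implied by a moment magnitude Mw:
    the inverse of the Hanks-Kanamori relation above."""
    return 10 ** (1.5 * mw + 9.1)

# Each unit step in Mw multiplies the seismic moment by 10**1.5,
# i.e. about 31.6 times; two steps multiply it by 1000.
print(round(seismic_moment(7.0) / seismic_moment(6.0), 1))  # -> 31.6
```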

  1. ^ These are normally not bolded. In the technical literature a single bolded "M" – with or without italicization – is used for several related concepts.
  2. ^ Bormann, Wendt & Di Giacomo 2013, p. 86.
  3. ^ Bormann, Wendt & Di Giacomo 2013, p. 18.
  4. ^ The "USGS Earthquake Magnitude Policy" for reporting earthquake magnitudes to the public as formulated by the USGS Earthquake Magnitude Working Group was implemented January 18, 2002, and posted at https://earthquake.usgs.gov/aboutus/docs/020204mag_policy.php. That page was removed following a web redesign; a copy is archived at the Internet Archive.