Health insurance mandate

A health insurance mandate is a requirement, imposed on either employers or individuals, to obtain private health insurance instead of (or in addition to) a national health insurance plan.[1]

  1. ^ D. Andrew Austin, Thomas L. Hungerford (2010). Market Structure of the Health Insurance Industry. Congressional Research Service, Library of Congress. Archived July 6, 2024, at the Wayback Machine.