Dental insurance

Dental insurance is a form of health insurance that covers a portion of the costs of dental care.