In survival analysis, the hazard ratio (HR) is the ratio of the hazard rates corresponding to the conditions characterized by two distinct levels of a treatment variable of interest. For example, in a clinical study of a drug, the treated population may die at twice the rate of the control population. The hazard ratio would be 2, indicating a higher hazard of death from the treatment.
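In symbols, writing λ_T(t) and λ_C(t) for the hazard functions of the treated and control groups (notation chosen here for illustration, not taken from a specific source), the hazard ratio is simply their quotient:

```latex
% Hazard ratio as a quotient of hazard functions.
% \lambda_T(t), \lambda_C(t): hazard rates of the treated and control
% groups (notation assumed here for illustration).
\[
  \mathrm{HR}(t) = \frac{\lambda_T(t)}{\lambda_C(t)},
  \qquad
  \lambda(t) = \lim_{\Delta t \to 0}
    \frac{\Pr\bigl(t \le T < t + \Delta t \mid T \ge t\bigr)}{\Delta t},
\]
% where T is the time to the event. In the drug example above,
% \lambda_T(t) = 2\lambda_C(t) at every time t, so HR = 2.
```

Under the proportional-hazards assumption, this quotient is constant over time, which is what allows a single number such as "HR = 2" to summarize the comparison.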
For example, a scientific paper might use an HR to state something such as: "Adequate COVID-19 vaccination status was associated with significantly decreased risk for the composite of severe COVID-19 or mortality with a[n] HR of 0.20 (95% CI, 0.17–0.22)."[1] In essence, the hazard for the composite outcome was 80% lower among the vaccinated than among the unvaccinated in the same study. So, for a hazardous outcome (e.g., severe disease or death), an HR below 1 indicates that the treatment (e.g., vaccination) is protective against the outcome of interest. Conversely, when the outcome itself is favorable, an HR greater than 1 indicates that the treatment is beneficial: if the outcome is accepting a job offer to end a spell of unemployment and the "treatment" is defined as seeking a job, an HR greater than 1 indicates that seeking a job is favorable to not seeking one.[2]
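To make the arithmetic concrete, here is a minimal Python sketch (with invented numbers, not data from the cited study) that estimates an HR under the simplifying assumption of constant hazards, where each group's hazard rate is its event count divided by its person-time at risk:

```python
# Minimal sketch: hazard ratio under constant hazards.
# All counts and person-years below are made up for illustration.

def hazard_rate(events: int, person_years: float) -> float:
    """Constant-hazard estimate: events per unit of person-time."""
    return events / person_years

treated_rate = hazard_rate(events=30, person_years=1_000.0)   # 0.03 per year
control_rate = hazard_rate(events=60, person_years=1_000.0)   # 0.06 per year

hr = treated_rate / control_rate
print(f"Hazard ratio: {hr:.2f}")  # 0.50: treatment halves the instantaneous risk
```

In practice, HRs are usually estimated with a Cox proportional-hazards model rather than raw rates, but the interpretation of the resulting ratio is the same.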
Hazard ratios differ from relative risks (RRs) and odds ratios (ORs) in that RRs and ORs are cumulative over an entire study, using a defined endpoint, while HRs represent instantaneous risk over the study time period, or some subset thereof. Hazard ratios suffer somewhat less from selection bias with respect to the endpoints chosen and can indicate risks that happen before the endpoint.
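The distinction can be seen in a short numeric sketch (the hazard rates below are arbitrary illustrative values, assuming exponential survival): a constant HR of 2 still produces a cumulative RR that depends on the chosen endpoint, drifting toward 1 as follow-up lengthens, because eventually both groups accumulate the event.

```python
# Sketch of why an HR and a cumulative relative risk (RR) can differ:
# with exponential survival S(t) = exp(-lambda * t), a constant HR of 2
# yields an endpoint-dependent RR that approaches 1 as follow-up grows.
import math

lam_control = 0.05             # control hazard rate per year (assumed)
lam_treated = 2 * lam_control  # constant HR = 2

for years in (1, 5, 20, 50):
    risk_treated = 1 - math.exp(-lam_treated * years)  # cumulative incidence
    risk_control = 1 - math.exp(-lam_control * years)
    rr = risk_treated / risk_control
    print(f"t = {years:>2} y: RR = {rr:.2f} (HR stays 2.00)")
```

Running this prints RRs of roughly 1.95, 1.78, 1.37, and 1.08 at 1, 5, 20, and 50 years, illustrating how the cumulative measure depends on where the endpoint is drawn even though the instantaneous ratio never changes.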