Workers' compensation in the United States (formerly known as workmen's compensation until the name was changed to be gender neutral) is a primarily state-based system.[1]
Some form of workers' compensation is compulsory for most employers in most states (depending on the features of the organization), with the notable exception of Texas as of 2018.[2] Even where coverage is not compulsory, businesses may purchase insurance voluntarily; in the United States, policies typically include Part One for compulsory coverage and Part Two for non-compulsory coverage.[3]
By 1949, every state had enacted a workers' compensation program.[4]